Job Title:
Data Engineer
Location:
Hillsboro, Oregon (This position is eligible for part-time work from home)
Position Duties:
Work with the Solution Architect & Staff Foundry Engineers to modernize data analytics systems and capabilities for the Global Operations and Quality Engineering departments.
Responsibilities:
• Design, develop and manage tools leveraging AI/ML & big data techniques to support ongoing business reporting, data warehousing and BI solutions.
• Collaborate with the Solution Architect to build processes and structures based on business & technical requirements to expand analytic intelligence capabilities.
• Work with Data Engineering counterparts to develop the data pipeline and metric definitions.
• Build various data visualizations to tell the story of trends, patterns and outliers.
• Work with Yield staff engineers to understand analytics use cases, determine data reporting and analysis requirements and translate business requirements into clear metrics.
• Support targeted ad hoc analysis methodologies to look for systematic low-yield signals.
• Own the design and development of automated solutions for recurring reporting and in-depth analysis.
• Support continuous business performance optimization by performing deep-dive analyses to form actionable recommendations and presenting them to business leaders to drive decisions.
• Assist in data administration, modeling and integration activities in data warehouse solutions.
• Ensure data accuracy by validating data for new and existing tools.
• Optimize data pipelines & monitor systems, creating dashboards for effective data integrity monitoring.
Minimum Requirements:
Master's degree in Computer Science, Manufacturing Engineering or related field
2 years' experience as a Data Engineer, Software Engineer or researcher
Knowledge of the following through education and/or work experience:
• SQL for storing and processing data
• Programming languages & libraries: Python and HTML/CSS
• Snowflake, AWS or other cloud-based data warehouse solution
• Data and database operations management: table queries, stored procedures, and data pipeline building
• Code version control with one or more tools: GitHub, Apache, AWS CodeCommit
• Building visualizations with BI tools: Streamlit and Tableau
• Data analytics in a semiconductor manufacturing environment
• Data integrity solutions: data validation & verification for ingestion pipelines
• Communication of technical concepts to mixed audiences of technical and business backgrounds