IT Data Technical Leader
Taichung, TW

Requisition Number: 54904

Corning is one of the world’s leading innovators in materials science. For more than 160 years, Corning has applied its unparalleled expertise in specialty glass, ceramics, and optical physics to develop products that have created new industries and transformed people’s lives.

Corning succeeds through sustained investment in R&D, a unique combination of material and process innovation, and close collaboration with customers to solve tough technology challenges.

Corning’s Display Technologies segment manufactures glass substrates for active matrix liquid crystal displays (LCDs), which are used primarily in LCD televisions, notebook computers, and flat-panel desktop monitors.

Role Summary:

Data and technology have drastically transformed the processes of industrial manufacturers from heavily human-powered to efficiently automated and system-driven.

Corning is creating innovative digital solutions for business divisions and functions and is looking for a passionate, hard-working, and talented Data Technical Leader to help build our data-driven products for reuse, velocity, and scale.

The Data Technical Leader will work with application development and IT operations to deploy data solutions using modern data and analytics technologies.

The primary responsibility will be to develop productized, reliable, and instrumented data ingestion pipelines that land inbound data from multiple process and operational data stores throughout the company into on-premises and cloud-based data lakes.

These pipelines will require automated data validation and data profiling, along with version control and CI/CD, to ensure the ongoing resiliency and maintainability of the inbound data flows supporting our data and analytics projects.

This position involves hands-on development and testing of both data pipelines and the engineering around them.
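As a rough flavor of the validation and profiling work described above (not part of the formal role description), a minimal inbound-data check might be sketched in plain Python as follows. The schema, field names, and error-rate threshold here are all hypothetical examples; a production pipeline would typically implement this with Spark or a dedicated data-quality framework.

```python
# Minimal sketch of a data-validation / profiling step for inbound records,
# assuming records arrive as dicts (e.g. parsed from CSV/JSON landing files).
# EXPECTED_SCHEMA and max_error_rate are hypothetical illustration values.

EXPECTED_SCHEMA = {"plant_id": str, "sensor": str, "reading": float}

def validate_record(record):
    """Return a list of problems found in one inbound record."""
    problems = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    return problems

def profile_batch(records, max_error_rate=0.05):
    """Profile a batch of records; flag it for rejection if too many are bad."""
    bad = [r for r in records if validate_record(r)]
    error_rate = len(bad) / len(records) if records else 0.0
    return {
        "total": len(records),
        "bad": len(bad),
        "error_rate": error_rate,
        "accepted": error_rate <= max_error_rate,
    }

batch = [
    {"plant_id": "TC-01", "sensor": "temp", "reading": 21.5},
    {"plant_id": "TC-01", "sensor": "temp"},  # missing "reading"
]
print(profile_batch(batch))
```

In a real ingestion flow, a profile like this would be emitted as pipeline instrumentation and checked in CI/CD before the batch is promoted into the data lake.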

Essential Responsibilities:

  • Designing and implementing highly performant data ingestion pipelines from structured and unstructured sources, using both batch and replication capabilities such as Qlik Data Integration, AWS DMS, and Databricks, and processing the data with S3, Apache Spark, Python, Delta Lake, and other relevant technologies
  • Implementing patterns of practice for productized, CI/CD-automated, and highly performant data transformations and aggregations, with an emphasis on integrated data models that support broad-based KPIs
  • Working closely with the business to identify, prioritize, and deliver on its most pressing and impactful analytical use cases, turning complex findings into simple visualizations and recommendations for execution
  • Delivering and presenting proofs of concept of key technology components to project stakeholders
  • Providing operational support documentation and procedures, and resolving data pipeline operational issues within stated SLAs
  • Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained
  • Working with other members of the project team to support delivery of additional project components
  • Evaluating the performance and applicability of multiple data ingestion approaches against customer requirements
  • Working within an Agile delivery / DevOps methodology to deliver proofs of concept and production implementations in iterative sprints
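To give a concrete flavor of the transformation-and-aggregation responsibilities above, a KPI roll-up could be sketched as follows in plain Python. The field names (plant IDs, unit counts) and the yield KPI itself are hypothetical illustrations; in practice this would be expressed as Spark or SQL transformations over the data lake.

```python
# Sketch of a KPI aggregation: roll up line-level production records into a
# per-plant yield percentage. All field names are hypothetical examples.
from collections import defaultdict

def plant_yield_kpi(records):
    """Aggregate good/total unit counts per plant and compute yield %."""
    totals = defaultdict(lambda: {"good": 0, "total": 0})
    for r in records:
        totals[r["plant_id"]]["good"] += r["good_units"]
        totals[r["plant_id"]]["total"] += r["total_units"]
    return {
        plant: round(100.0 * t["good"] / t["total"], 2)
        for plant, t in totals.items()
        if t["total"]
    }

records = [
    {"plant_id": "TC-01", "good_units": 95, "total_units": 100},
    {"plant_id": "TC-01", "good_units": 90, "total_units": 100},
    {"plant_id": "TC-02", "good_units": 50, "total_units": 50},
]
print(plant_yield_kpi(records))  # {'TC-01': 92.5, 'TC-02': 100.0}
```

An "integrated data model supporting broad-based KPIs" essentially productizes aggregations like this one so that many dashboards and teams can reuse the same definition.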
Qualifications / Requirements:

  • Bachelor's degree; Information Systems, Information Technology (IT), Computer Science, or Engineering preferred
  • 5+ years of relevant IT experience in data analytics roles
  • 4+ years of experience in big data engineering roles, developing and maintaining ETL and ELT pipelines for data warehousing and for on-premises and cloud data lake environments
  • 3+ years of hands-on experience with Python and SSIS
  • Experience with SQL and NoSQL databases
  • Experience with AWS data platforms, including AWS Database Migration Service (DMS) and AWS Data Pipeline
  • Preferred: experience with agile software development and continuous integration / continuous deployment (CI/CD) methodologies, along with supporting tools such as Git (GitLab), Jira, and Terraform
General Qualifications:

  • Able to conceptualize, plan, document and communicate effectively
  • Proven success in communicating with users, other technical teams, and senior management to collect requirements, describe data modeling decisions and data engineering strategy
  • Fluent in English and Chinese, including speaking, writing, and presenting
  • Experience leading other engineers in data engineering best practices
  • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
  • Understanding of manufacturing business data domains (e.g., manufacturing, engineering, and supply chain)
  • Solves complex problems with effective solutions using rigorous logic and methods
  • Highly self-motivated and directed
  • Team-oriented, with leadership skills for a collaborative environment
  • Effective planning skills, with ability to handle changing priorities
  • Ability to work with international teams across multiple initiatives that span various working groups, geographic borders, time zones, and cultures
  • Must be a self-starter, able to work in a team environment and motivate others
  • Good interpersonal and influencing skills

Travel: Limited work travel