Requisition Number : 57013
Corning is one of the world’s leading innovators in materials science. For more than 160 years, Corning has applied its unparalleled expertise in specialty glass, ceramics, and optical physics to develop products that have created new industries and transformed people’s lives.
Corning succeeds through sustained investment in R&D, a unique combination of material and process innovation, and close collaboration with customers to solve tough technology challenges.
Corning’s Display Technologies segment manufactures glass substrates for active matrix liquid crystal displays (LCDs) that are used primarily in LCD televisions, notebook computers, and flat-panel desktop monitors.
Role Summary:
Data and technologies have drastically transformed the processes of industrial manufacturers from heavily human-powered to efficiently automated and system-driven.
Corning is creating innovative digital solutions for business divisions and functions and is looking for a passionate, hard-working, and talented Data Technical Leader to help build our data-driven products for reuse, velocity, and scale.
The Data Technical Leader will work with application development and IT operations to deploy data solutions using modern data/analytics technologies.
Their primary responsibility will be to develop productized, reliable, and instrumented data ingestion pipelines that land inbound data from multiple process and operational data stores throughout the company in on-premises and cloud-based data lakes.
These pipelines will require data validation and data profiling automation, along with version control and CI/CD, to ensure the ongoing resiliency and maintainability of the inbound data flows supporting our data and analytics projects.
This position involves hands-on data pipeline engineering, development, and testing.
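The kind of validation and profiling automation described above can be sketched in plain Python. All names here (`REQUIRED_SCHEMA`, `validate_batch`, the sample fields) are illustrative assumptions, not Corning's actual stack or data model:

```python
# Illustrative sketch of a validate-and-profile step in an ingestion pipeline.
# Schema and field names are hypothetical examples, not real Corning data.
from typing import Any

# Hypothetical schema for an inbound batch: field name -> expected type.
REQUIRED_SCHEMA: dict = {"lot_id": str, "temp_c": float}

def validate_batch(records):
    """Split records into valid/quarantined and return a simple profile."""
    valid, invalid = [], []
    null_counts = {field: 0 for field in REQUIRED_SCHEMA}
    for rec in records:
        ok = True
        for field, expected in REQUIRED_SCHEMA.items():
            value = rec.get(field)
            if value is None:
                null_counts[field] += 1  # profile missing values per field
                ok = False
            elif not isinstance(value, expected):
                ok = False  # type mismatch -> quarantine the record
        (valid if ok else invalid).append(rec)
    profile = {"rows": len(records), "valid": len(valid), "null_counts": null_counts}
    return valid, invalid, profile

batch = [
    {"lot_id": "A1", "temp_c": 712.5},   # clean record
    {"lot_id": "A2", "temp_c": None},    # missing measurement -> quarantined
    {"lot_id": 3, "temp_c": 690.0},      # wrong type -> quarantined
]
valid, invalid, profile = validate_batch(batch)
```

In a production pipeline this gate would typically run before the landing write, with the quarantined records and the profile emitted to monitoring so CI/CD checks can catch upstream schema drift.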
Essential Responsibilities:
Designing and implementing highly performant data ingestion pipelines from structured sources, using both batch and replication capabilities such as Qlik Data Integration, AWS DMS, and Databricks, and processing the data using S3, Apache Spark, Python, Delta Lake, and other relevant tech stacks
Implementing patterns of practice for productized, CI/CD-automated, and highly performant data transformations and aggregations, with an emphasis on integrated data models that support broad-based KPIs
Working closely with the business to identify, prioritize, and deliver on their most pressing and impactful analytical use cases, analyzing and translating business needs into long-term solution data models and turning complex findings into simple visualizations and recommendations for execution
Delivering and presenting proofs of concept of key technology components to project stakeholders
Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of the data are always maintained
Working with other members of the project team to support delivery of additional project components
Evaluating the performance and applicability of multiple data ingestion approaches against customer requirements
Working within an Agile delivery/DevOps methodology to deliver proofs of concept and production implementations in iterative sprints
Qualifications/Requirements:
Bachelor's degree; Information Systems, Information Technology (IT), Computer Science, or Engineering preferred
5+ years of relevant IT experience in data analytics roles
4+ years of experience in data engineering roles, developing and maintaining data pipelines for data warehousing in on-premises or cloud data lake environments
3+ years of hands-on experience with physical and relational data modeling
3+ years of hands-on technical familiarity with Python and Spark/Spark Streaming; experience with Databricks on the AWS platform is a plus
Good knowledge of ERP supply chain management data domains; familiarity with SAP data (PP, MM, SD) is a plus
Experience with Agile software development and continuous integration/continuous deployment (CI/CD) methodologies, along with supporting tools such as Git (GitLab), Jira, and Terraform