Data Engineering Lead

About the Role

Big Data is dramatically changing our lives and opening up new business opportunities. Centelon is at the forefront of this global transformation. We are working on path-breaking problems such as cloud-based data warehousing, business intelligence, data lakes, and data migration.

We are looking for action-oriented, motivated individuals who are obsessed with engineering excellence and customer satisfaction.

The ideal candidate has a flair for fast learning and coding. This position offers a steep learning curve, with significant exposure to working directly with global clients. You are expected to be one of the best in the industry in applied big data.

Expected Skills:

· 8+ years of overall experience in the big data space, leading projects or units of work.

· Experience in architecting, designing, and delivering data pipeline solutions on cloud technologies such as Azure Synapse, AWS EMR, Databricks, or Snowflake. Certifications are a plus.

· Project experience in Data Warehousing, Business Intelligence, or Data Analytics.

· Solid foundation in data warehouse and BI design, tools, processes, and implementation approaches.

· Experience with building schedule-driven workflows, working with serialization formats, data modeling, and architecting for performance.

· Hands-on experience using PySpark DataFrames and PySpark SQL.

· Data modeling skills using tools such as Erwin.

· Exposure to ETL/ELT tools such as Azure Data Factory and AWS Glue, and to query engines such as Amazon Athena.

· Exposure to visualization tools is an added advantage.