Qualifications:
- Information Technology & Other; 5+ years of experience; Degree or Honours (12+3 or equivalent).
- Degree in a relevant field such as Computer Science, Computational Mathematics, Computer Engineering, or Software Engineering.
- Specialization or electives in a Data & Analytics field (e.g., Data Warehousing, Data Science, Business Intelligence) is a nice-to-have.
- 5+ years of experience in Data Engineering (fewer years' experience will be considered for Master's degree holders).
Experience:
- Expertise in Data Warehouse engineering including extraction, transformation, and data processing by developing/managing corresponding database tables.
- Expertise in database engineering and experience in SQL/PL-SQL. A key contributor to the organizational framework that standardizes the process of data collection, storage, transformation, distribution, and usage.
- Expertise in proactive performance monitoring, conducting data quality testing, and troubleshooting performance issues of complex warehouse data tables.
- Experience in assembling data from multiple sources, and analyzing and modeling complex datasets.
- Strong expertise in co-creating and refining key business KPIs, designing dashboards, implementing the right performance metrics, and using data visualization to deliver business insights effectively.
- Deep understanding of visual design standards and best practices related to BI & Reporting.
- Experience in building reusable components.
- Experience designing monitoring and alerting solutions for data pipelines and data repositories.
- Operates with efficiency and automation in mind; experience building reliable, reusable automation frameworks (e.g., CI/CD).
- Airline industry experience is nice to have.
Knowledge/Skills:
- Strong ability to conduct data analysis (e.g., source system identification, data dictionary/metadata collection, data profiling, source-to-target mapping) is preferred.
- Operates with a "You Code It, You Own It" mindset (i.e., supports the products they build).
- Demonstrated problem-solver; able to design and document solutions independently.
- Team player: able to collaborate with others to remove blockers, solve complex design problems, and debug/resolve issues.
- Able to deliver solutions (and associated value) iteratively.
- Is accountable and displays a positive attitude.
- Self-starter and has a passion for exploring and learning new technologies, especially those in the Enterprise Data & Analytics space.
Key Technologies/Tools:
- Big Data: Hadoop, Spark, Scala, Hive, HBase, Sqoop, Oozie, Apache NiFi, Airflow, HDFS, ADLS (Gen 2), Azure Data Factory (ADF), Databricks, Kafka, Elasticsearch, Avro/Parquet file formats
- Data Analysis, Modelling, and Reporting: Snowflake, SQL, Data Vault 2.0, MicroStrategy, Power BI
- Cloud Technologies: Microsoft Azure and Cloudera technology stacks
- Integration and Messaging: Streaming (e.g., Spark Streaming), SnapLogic, TIBCO, Kafka
- CI/CD: Git, Bitbucket, Azure DevOps, Jenkins, JIRA, Confluence
- Automation: Java, Selenium, AppDynamics, HP LoadRunner, JMeter, Python, Automation Anywhere
Leadership Role: No