Data Engineer at ENGIE Energy Access


ENGIE Energy Access is one of the leading Pay-As-You-Go (PAYGo) and mini-grid solutions providers in Africa, with a mission to deliver affordable, reliable and sustainable energy solutions and life-changing services with exceptional customer experience. The company is the result of the integration of Fenix International, ENGIE Mobisol and ENGIE PowerCorner, and develops innovative, off-grid solar solutions for homes, public services and businesses, enabling customers and distribution partners to access clean, affordable energy.

The PAYGo solar home systems are financed through affordable installments from $0.19 per day, and the mini-grids foster economic development by enabling electrical productive use and triggering business opportunities for entrepreneurs in rural communities. With over 1,700 employees, operations in 9 countries across Africa (Benin, Côte d'Ivoire, Kenya, Mozambique, Nigeria, Rwanda, Tanzania, Uganda and Zambia), over 1.2 million customers and more than 6 million lives impacted so far, ENGIE Energy Access aims to remain the leading clean energy company, serving millions of customers across Africa by 2025.

Job Position: Data Engineer

Job Location: Nigeria

Job Grade: HL15

Job Description

  1. This position will be part of the Global Data team that is based across Germany, Uganda, Kenya, and Nigeria.
  2. You will report to the Head of Data and work closely with data scientists, the DevOps team, and software engineers.
  3. This is an incredible opportunity for a talented individual to join a high-performing team that is passionate about pioneering expanded financial services to off-grid customers at the base of the pyramid.
  4. Key responsibilities will include building, maintaining, and ensuring the scalability of data pipelines between the MySQL databases that serve our in-house applications, IoT data delivered from devices, our PBX, our in-house ticketing system, and the data lake used for analytics across the company.
  5. You will also be responsible for building and optimizing pipelines that deliver data in real time to our field team's mobile application, enabling data-informed decisions in the field, as well as working with members of the data team to ensure high code quality and sound database design.
  6. Your work will make a meaningful impact by enabling EEA to continuously innovate on how we support our customers in their solar kit experience and repayment journey.

Job Responsibilities

  1. Work with the data and software teams to design, build, and support critical data pipelines using Airflow for modeling, ETL, and analytics.
  2. Optimize data storage between Redshift, S3, and other storage solutions to support data analytics, modeling, archiving, and data integrity.
  3. Develop logging and KPI visualization dashboards with Grafana or a similar tool to measure the efficiency of business processes.
  4. Containerize models with Docker and Kubernetes to serve real-time financial information to field teams.
  5. Work with software engineers and devops team to optimize performance of in-house applications which communicate data through APIs and other means.
  6. Maintain and develop tools for unit testing, streamlined ETL, and other processes for the Data Team.
  7. Mentor data scientists and analysts on best coding practices through code reviews and discussions.
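
The pipeline and testing duties above can be sketched with a minimal, self-contained example: a pure-Python aggregation of the kind that might sit inside a single Airflow task and be covered by a unit test. The function name and record fields below are hypothetical illustrations, not EEA's actual schema.

```python
from collections import defaultdict

def total_payments_by_customer(records):
    """Sum payment amounts (in USD) per customer_id.

    `records` is an iterable of dicts with hypothetical fields
    "customer_id" and "amount" standing in for PAYGo installment rows.
    """
    totals = defaultdict(float)
    for rec in records:
        totals[rec["customer_id"]] += rec["amount"]
    return dict(totals)

# Hypothetical sample of daily $0.19 installment records.
sample = [
    {"customer_id": "C1", "amount": 0.19},
    {"customer_id": "C1", "amount": 0.19},
    {"customer_id": "C2", "amount": 0.19},
]
print(total_payments_by_customer(sample))
```

In a real pipeline, a transform like this would be wrapped in an Airflow task, and the pure function kept separate so it can be unit-tested without the scheduler.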

Job Requirements


Education:

  • Degree in Computer Science or a related field


Experience and Skills:

  1. 3+ years of industry experience
  2. Experience building infrastructure to support streaming or offline data
  3. Extensive programming experience in Python/Scala/Java
  4. Experience with SQL in addition to one or more of Spark/Hadoop/Hive/HDFS
  5. Experience with implementing unit and integration testing
  6. Ability to gather requirements and communicate with stakeholders across data, software, and platform teams
  7. Ability to develop a strategic vision for data pipelining and infrastructure
  8. AWS Certification is a plus
  9. Strong communication with data, DevOps, and software team members
  10. Sense of adventure and willingness to dive in, think big, and execute with a team


Languages:

  1. English
  2. French, Portuguese, or German is a plus


Technologies and Tools:

  1. Linux-based systems
  2. Knowledge of Amazon Web Services (AWS) and its services, including but not limited to CloudWatch, RDS, Redshift, Lambda, EMR, S3, SQS and EC2
  3. Python, Jupyter notebooks
  4. Airflow (or other workflow management systems, such as Luigi)
  5. Docker, Kubernetes, or other container tools
  6. Streaming tools such as Kafka and Kinesis
  7. Knowledge of Hetzner is a plus

How to Apply
Interested and qualified candidates should:
Click here to apply online

