The Company is in a development and growth phase and relies on the flexibility and innovative skills of its employees to ensure success. The work environment reflects this dynamic period while maintaining high corporate standards. The company has a superb open-plan office in central Edinburgh and offers a fun but hardworking and focused environment, reflective of its flexible and informal culture.
Flexitricity operates in real-time energy markets, delivering flexible electricity capacity at optimised prices to National Grid, distribution networks and other market participants. Our job is to earn the best return for our customers using the flexible capacity they have available at the time.
About the role
The Data Engineer will play a critical role in facilitating the use of data across the business. You'll simplify and scale our organisation by developing innovative data-driven solutions through data pipelines, data modelling and ETL/ELT design. The role will also play a central part in the development of our machine learning and reporting architecture.
Working closely with the Head of Data and another data engineer, you will be responsible for building automated, robust data pipelines that align with wider business initiatives. Key responsibilities include:
- Build and maintain data pipelines (including those from 3rd party sources and APIs).
- Implement infrastructure for data science and analytics, monitoring applied models and fine-tuning algorithm calculations (with support from data scientists).
- Identify and patch issues and bugs in the pipeline and architecture.
- Ensure the reliability of infrastructure and provisioned data.
- Work as part of a centralised team to deliver a quality service to internal customers.
- Communicate with the team to define data requirements that support business issues and queries, including collecting, analysing and interpreting data and translating the results for non-technical stakeholders.
Required skills and qualifications
- High-quality degree in a technical discipline or equivalent experience.
- Experience working with Azure.
- Good Azure Data Factory knowledge.
- Strong experience using Python for data manipulation.
- Experience with data lakes and parquet storage.
- Experience with SQL-based databases (Postgres, MS-SQL).
- Experience of building data pipelines with machine learning outcomes in mind.
- Experience of building data pipelines with reporting and visualisation outcomes in mind.
- Experience using Git for project version control.
- Experience using Azure Functions.
- Experience with distributed data management tools (e.g. PySpark, Dask).
- Experience developing and maintaining CI/CD processes.
- Data warehousing and data modelling experience.
- Experience with Databricks.
- Experience with streaming data.
- Proactive and highly reliable.
- A team player, willing to get involved in tasks across the business.
- Lateral and creative thinker.
- Balances attention to detail with a commercial focus and a customer service-orientated approach.
- Able to perform effectively under pressure with good personal organisation, time management and prioritisation skills.
We are committed to rewarding initiative and excellence, and we offer a competitive salary with comprehensive benefits. You can expect challenge, learning and autonomy in creating value in our dynamic industry-leading business which is playing a critical role in the transition to a zero-carbon future.
Interested candidates should apply with a CV to email@example.com.