Welcome to the future of energy

We're a leading energy technology company providing a better experience for our customers through transparency, honesty and simplicity. Better for the planet, through real long-term investment in renewable generation and a low-CO2 future, and better for our customers' wallets, with fair and affordable plans. Through our proprietary platform and custom-built stack, cloud-based billing, and sophisticated use of data science, we're revolutionizing what's possible in energy. We're transforming the way people interact with their energy company by making it approachable, easy to understand, and, most importantly, 100% renewable. We've distinguished ourselves by being named 2020's Energy Provider of the Year, which highlights our commitment to exceptional customer service. We've just hopped across the pond to the US and are making big strides in Texas...but that's only the beginning. Come join our rapidly growing team as we continue to break barriers in energy.
At Octopus we're developing a data platform that provides data services to all our retail energy businesses around the world. The platform enables self-service data analytics for hundreds of data-hungry users, as well as automation of all our data workflows, from simple ETL jobs to ML training and prediction. The data platform team works across the whole customer domain, on anything from marketing campaign optimisation to helping our traders buy the right amount of energy.

Octopus Energy is growing fast, and that means lots of data that needs to be ingested, organised, analysed and shared with the team. We're looking for a data engineer who can help us with this challenge. You'll work across all different parts of the business to understand what our teams need, and deliver data pipelines and tools to meet those needs. Because it's still early days, you'll need to be versatile and equally comfortable building robust, production-ready pipelines or hacking together a quick script to run on your machine. You'll spend most of your time engineering, but you should also enjoy analysing data and building data interfaces such as Tableau dashboards or data applications. You'll be part of our global data platform team, which will provide DevOps and infrastructure support and technical guidance. We're building a consistent data platform across all Octopus retail businesses in the UK, US, Germany, Spain, Japan and NZ, so you'll be part of, and contribute to, a global data community.
What you'll do
- Work with the data scientists and data analysts to scope out and plan new data sources and pipelines
- Build, automate, deploy and maintain data pipelines
- Build Streamlit data apps and lend a hand building and maintaining Tableau dashboards
- Work with the global data platform team to deploy new tools and services into the US data environment
- Participate in and contribute to our global data community
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business
- Build large-scale batch and real-time data pipelines with data processing frameworks on a cloud platform such as AWS, Azure or GCP
What you'll have
- First and foremost, we want our data engineers to be great software engineers with a passion for writing high quality code
- It would be helpful to have experience/expertise in the following (in rough priority order):
- Python (in the context of data pipelines and analytics)
- Advanced SQL knowledge, experience authoring queries against relational databases, and working familiarity with a variety of database systems
- Experience modelling data for analytics - ideally experience using dbt as a modelling tool
- Experience building data pipelines in a cloud environment (ideally AWS)
- The projects will be varied, so we're looking for someone who can work autonomously and proactively to scope problems and deliver pragmatic solutions
- We want someone who is passionate about building great data tools for our business teams
Our data platform stack
- Python as our main programming/scripting language
- Kubernetes for data services and task orchestration
- Airflow purely for job scheduling and tracking
- CircleCI for continuous deployment
- Parquet and Databricks Delta file formats on S3 for data lake storage
- Spark and pandas for data processing
- dbt for data modelling
- Presto and Spark SQL for querying
- Streamlit for data applications
- Tableau for BI

If this all sounds like you, then we'd love to hear from you. At Octopus, we're looking for genuinely decent people who are honest and empathetic. Our commitment is to provide equal opportunities, a diverse and inclusive work environment, and fairness for everyone. You are welcome to apply no matter your race, gender identity, sexuality, age, family or civil status, disability, religion, or ethnicity.