Cloud Engineer

Iga Koch
Contact person
JUN 11
Dolnośląskie · Permanent · Engineer's/Bachelor's degree
11.07.2021 · 108146102


Our client: a privately-owned data solutions consultancy enabling clients to Organise, Analyse, and Commercialise their business data. Established in Edinburgh in 2010, the company has an enviable customer list including leading organisations throughout the UK and Australia. Forecast is establishing an Engineering Centre of Excellence in Wrocław with a multi-disciplinary team of data engineers, data analysts, and data scientists.

Permanent full-time position based in Wrocław

Highly attractive remuneration package

Challenging position in a small and growing team

Occasional travel (post-COVID)

Responsibilities

Create and maintain optimal data pipeline architecture

Assemble large, complex data sets that meet functional / non-functional business requirements

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies

Build analytics tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics

Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs

Keep our data separated and secure across geographic boundaries through multiple isolation models

Create data tools for analytics and data scientist team members that assist them in building and optimising our product into an innovative industry leader

Work with our data science team and cloud architects to develop greater functionality in our data systems


Requirements

Undergraduate degree in computer science, mathematics, or similar

Cloud certification: AWS, GCP, or Azure

3+ years of experience with Python, Scala/Java (or similar), SQL, and data visualisation/exploration tools

Solution Experience: ETL, data warehousing, relational databases, NoSQL, streaming technologies, APIs and Microservices, workflow/schedulers

Data modelling experience (conceptual, logical, physical)

Working knowledge of structured, semi-structured and unstructured data

Competence in data security and compliance

You should also have experience using the following software/tools:

- Hadoop, Spark, Kafka, etc.

- Relational SQL and NoSQL databases, including Postgres and Cassandra

- Data pipeline and workflow management tools (Azkaban, Luigi, Airflow, etc.)

- AWS-specific data technologies (EC2, EMR, RDS, Redshift)

- Stream-processing systems (Storm, Spark Streaming, Flink, etc.)

Proficiency in both English and Polish is essential
