Data Engineer / Architect
This assignment expired 1 year ago

What makes Cognizant a unique place to work? The combination of rapid growth and an international, innovative environment! This creates a lot of opportunities for people like YOU — people with an entrepreneurial spirit who want to make a difference in this world. At Cognizant, together with your colleagues globally, you will collaborate on crafting solutions for prestigious companies, helping them become more flexible, innovative and successful. And this is your chance to join the success story: we are looking for a Data Engineer/Architect to join our Data+ Team.

About the role:

As a Data Engineer/Architect, you help clients transform their business, mostly driven by digital opportunities and technology. In a typical programme you evaluate, test and select new data tools and technologies to solve business and IT needs. Your communication skills are strong enough to explain technical concepts to a non-technical audience. You are well versed in DWH and ETL/ELT concepts and in building data pipelines. Using the cloud, and especially Azure, is your daily routine.

You will work in a fast-paced Agile environment and be expected to facilitate, guide and influence clients and teams towards the right information technology architecture, acting as the interface between business leadership, tech leadership and the delivery teams. You will be expected to deliver complex designs and high-performance architecture in Azure and DWH technologies.


• Snowflake data modelling (3NF & Dimensional)

• Excellent verbal and written communication skills, including the ability to communicate technical concepts to non-technical people

• Understanding of ETL concepts and architecture

• Evaluate, test and select new technologies that complement current stack and solve existing problems.

• Infrastructure creation and maintenance, writing deployment scripts.

• Keeping infrastructure secure and up to date; regular meetings with the security team.

• Set and improve the general data product development process; set up integration for CI/CD and automated testing pipelines.

• Managing user access to Snowflake, GitHub Organization, Prefect, dbt Cloud.

• Set standards for data modelling; help teams build optimized warehouses and data products.

• DevOps process setup and improvements.

• Main data pipeline and flow design and deployment – setting up dbt jobs that are optimal and reliable according to business needs.

• Keywords and technologies: Snowflake, dbt, Python, PySpark, Azure Key Vault, Prefect, Git, Git Bash, GitHub Actions, Bicep, IaC
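To make the ETL/ELT responsibilities above concrete, here is a minimal, hypothetical sketch of an extract-transform-load flow in plain Python. All names and data are invented for illustration; in the role described, tools such as dbt, Snowflake and Prefect would handle these stages at production scale.

```python
# Hypothetical minimal ETL sketch: extract raw CSV records,
# transform (type-cast and filter), and load into a target.
import csv
import io

# Stand-in for a raw source extract (in reality: files, APIs, databases).
RAW_CSV = """order_id,amount,currency
1,100.0,EUR
2,250.5,EUR
3,80.0,USD
"""

def extract(raw: str) -> list:
    """Extract: read raw records from a source (here, in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: cast string fields to proper types and keep EUR orders only."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["currency"] == "EUR"
    ]

def load(rows: list, target: list) -> None:
    """Load: append cleaned rows to a target (a stand-in for a warehouse table)."""
    target.extend(rows)

warehouse_table = []
load(transform(extract(RAW_CSV)), warehouse_table)
print(warehouse_table)
```

In an ELT variant, the raw rows would be loaded into the warehouse first and the transform step would run inside it (for example, as SQL models managed by dbt).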

What you can expect:

• Become part of the flagship's success story - we are going through enormous growth!

• An organization driven by technology - We have a tremendous technology backbone

• Open, ‘can do’ team spirit

• An environment where you can make your own ideas a reality

• Drive your own career

• Attractive salary (€4,229-6,690/month gross) depending on competency level

• Competitive benefits package

• Scandinavian ways of working

• Opportunity to grow both professionally and personally (incl. Udemy)

Required Skills

• ETL (Extract, Transform, Load): 3-4 years
• Spark: 2-3 years
• CI/CD: 2-3 years
• Python: 2-3 years
• Git: 2-3 years
• Azure: 2-3 years

About the assignment

Location
Vilnius, Lithuania
Rate (after tax)
€2,500 - 4,000/month
Full-time
