Job listings
Data Engineer (ETL, Azure)
Location: Wrocław
Technologies we use: Python, SQL, AWS, Microsoft Azure, CI/CD (expected); Azure Databricks, Data Lake, Data Factory, Glue, Synapse, Snowflake, Amazon Redshift (optional)
About the project: Join Capgemini and help shape the future of healthcare data. As a Data Engineer, you’ll design and optimize cloud-based data solutions that power critical insights and improve patient outcomes. Your work will directly contribute to projects that make a real difference. Join Our...
- Design, implement, and maintain data processing solutions in AWS and Azure environments.
- Build and optimize ETL pipelines and data integration workflows.
- Develop and maintain data pipelines using Azure Data Factory, Synapse, and Databricks.
- Ensure data quality, security, and compliance standards across all solutions.
- Collaborate with analysts, business stakeholders, and cross-functional teams to deliver reliable data solutions.
- Support data architecture improvements and cloud ...
- 3-6 years of experience in data engineering (ETL, integration, modeling).
- Strong knowledge of AWS and/or Azure services (Azure Databricks, Data Lake, Data Factory, Glue, Synapse, Snowflake, Amazon Redshift).
- Proficiency in Python and SQL.
- Familiarity with CI/CD tools and basic DevOps concepts.
- Understanding of data integration and data quality principles.
- English at B2 level or higher.
- Healthcare or fintech/pharma project experience is a plus.
- Practical benefits: private medical care from Medicover, with additional packages (e.g., dental, senior care, oncology) available on preferential terms; life insurance; and 40 options on our NAIS benefit platform, including Netflix, Spotify, or a sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform.
- Free access to Education First languages platform, TED Talks, and Udemy Business ...
Data Engineer (ETL, Azure). Location: Warszawa (same listing as the Wrocław posting above).
Data Engineer (ETL, Azure). Location: Poznań (same listing as the Wrocław posting above).
Data Engineer (ETL, Azure). Location: Opole (same listing as the Wrocław posting above).
Data Engineer (ETL, Azure). Location: Lublin (same listing as the Wrocław posting above).
Data Engineer (ETL, Azure). Location: Kraków (same listing as the Wrocław posting above).
Data Engineer (ETL, Azure). Location: Katowice (same listing as the Wrocław posting above).
Data Engineer (ETL, Azure). Location: Gdańsk (same listing as the Wrocław posting above).
Principal Software Engineer (.NET / MS Azure / AI)
Location: Warszawa
Technologies we use: .NET, DevOps, Angular, React Native, Microsoft Azure, AI (required); operating system: Windows, macOS
About the project: An international project in the tax industry. This role offers the opportunity to redefine how software is built by integrating AI assistants and multi-agent workflows into the software development lifecycle. Recruitment process: one screening call, three meetings...
Systems Integration Engineer (SysOps)
42 days ago, posted by Ośrodek Badawczo-Rozwojowy Centrum Techniki Morskiej S.A. w Gdyni
Systems Integration Engineer (SysOps)
Location: Gdynia
Technologies we use: Bash, Python, Ansible, VMware vSphere, Wireshark, tcpdump (required); operating system: Windows, Linux
Your responsibilities:
- Installing and configuring Linux and Windows systems
- Managing users via Active Directory (AD)
- Writing Bash and Python scripts
- Configuring network devices
- Maintaining the laboratory environment
- Supporting the customer on ongoing projects
- Participating in FAT, HAT, SAT, SIT, and PZO trials
- At least two years of experience as a SysOps engineer, deployment engineer, or network engineer
- Excellent hands-on knowledge of Linux and Windows systems, covering installation, configuration, and troubleshooting
- Ability to write automation scripts in Bash/Python/Ansible
- Knowledge of networking fundamentals (TCP/IP, UDP, DNS, NAT, ACL)
- Knowledge of VMware vSphere virtualization concepts and tools
- Familiarity with network diagnostics tools such as...
- Subsidized sports activities
- Private medical care
- Subsidized language classes
- Subsidized training and courses
- Life insurance
- Remote work option
- Flexible working hours
- No dress code
- Employee parking
- Relaxation area
- Subsidized holidays
- Subsidized children's holidays
Senior Kotlin Developer / Tech Lead (Kotlin/Python)
Location: Kraków
Technologies we use: Kotlin, Python (expected)
About the project: Salary: 1000-1400 PLN/day on a B2B contract. Work mode: flexible hybrid (1 day per week from Gdynia/Warszawa/Kraków/Rzeszów). You will join the Language Intelligence team, responsible for building and scaling intelligent solutions supporting language-driven features within the product ecosystem. The team works in a cross-functional setup, closely collaborating with Product,...
- Lead and develop the Language Intelligence engineering team
- Take end-to-end ownership of delivery, quality, and reliability
- Drive architectural decisions and technical direction
- Ensure engineering excellence standards (code quality, testing, CI/CD, production readiness)
- Collaborate closely with Product, UX, and Business stakeholders
- Translate business needs into technical solutions and execution plans
- Participate in product discovery as a technical partner
- Conduct code reviews ...
- Proven experience leading a software development team (people leadership, delivery ownership)
- At least 7 years of experience with Kotlin, plus experience with Python
- Ability to contribute to architecture decisions and conduct high-quality code reviews
- Experience in ensuring engineering excellence (clean code, testing strategy, CI/CD, production readiness)
- Strong ownership mindset and accountability for outcomes, quality, and reliability
- Ability to translate business goals into clear ...
- Real impact on product direction as a technical partner in product discovery
- Strong ownership and autonomy in technical and delivery decisions
- Possibility to work on AI-powered and innovative solutions
- Contribute to the development of a leading e-commerce advertising platform
Senior Kotlin Developer / Tech Lead (Kotlin/Python). Location: Rzeszów (same listing as the Kraków posting above).
Data Solutions Business Analyst – Data Analytics & Cloud Integration
42 days ago, posted by ITDS Polska Sp. z o.o.
Data Solutions Business Analyst – Data Analytics & Cloud Integration
Location: Kraków
Technologies we use: Google Cloud Platform, Hadoop (expected); operating system: Windows
About the project: As a Data Solutions Business Analyst – Data Analytics & Cloud Integration, you will be working for our client, an international leader in innovative data management and analytics. You will lead initiatives to enhance data ingestion, data quality, and integration with platforms like Google Cloud and...
- Elicit requirements using interviews, workshops, and process analysis to translate complex business needs into effective data solutions.
- Collaborate with stakeholders to define project scope, objectives, and deliverables aligned with strategic goals.
- Work closely with data engineers, project teams, and business users to ensure clear communication, documentation, and successful project execution.
- Analyze, manipulate, and document data, creating technical specifications and data ...
- Minimum of 3 years of experience in Business Analysis, Data Analytics, or Data Platform roles.
- Strong understanding of ESG data and the data lifecycle in general.
- Proven experience delivering technically focused analysis for change initiatives involving multiple organizations.
- Solid knowledge of cloud platforms, particularly Google Cloud, Hadoop, or related big data tools.
- Hands-on experience with data integration solutions and third-party data sources.
- Excellent communication skills...
- Stable and long-term cooperation with very good conditions.
- Enhance your skills and develop your expertise in the financial industry.
- Work on the most strategic projects available in the market.
- Define your career roadmap and develop yourself in the best and fastest possible way by delivering strategic projects for different clients of ITDS over several years.
- Participate in Social Events, training, and work in an international environment.
- Access to attractive Medical Package.
- ...
Cloud Infrastructure DBA
Location: Katowice
Technologies we use: AWS, Python, Bash, Go, Linux, Terraform, Ansible, PostgreSQL (expected); operating system: Linux
About the project: Technologies used in our team: AWS, Terraform, Kubernetes, Ansible, Elasticsearch, Kafka, PostgreSQL, VictoriaMetrics, HashiCorp stack, GitLab, Jenkins, Nexus.
Your responsibilities:
- Acting first and foremost as a DevOps engineer, and only then as a DBA
- Act as the liaison between development teams and DevOps
- Act as a mediator and advisor in determining the division of responsibilities between infrastructure and application layers
- Designing database structures and finding common ground with other teams at the technical level
- Implementing backup and restore, disaster recovery and business continuity strategies within database clusters
- Building and setting up new database clusters
- ...
- Working knowledge of AWS, sufficient for efficient day-to-day operations
- Good knowledge of Python/Bash/Go
- Good knowledge of Linux
- Experience with Terraform and/or Ansible
- Strong knowledge and practical experience as an administrator (DBA) with PostgreSQL
- Optional: Bachelor’s degree in computer science, information technology, or other related field of study
- Form of employment of your choosing (B2B, CoE)
- Lloyds insurance (in case of cooperation on a B2B basis)
- Subsidy for the purchase of glasses (300 PLN/year)
- Free parking 3 minutes from the office or shared underground parking (can be reserved - first come, first served)
- Private medical care with dentists package for you and your family
- Group life insurance for you and your partner
- Multisport card as part of the MyBenefit package
- Dell laptop, keyboard, mouse, wireless headphones, ...
ERP System Sales Specialist
Location: Gdynia
Your responsibilities: actively acquiring business partners in the IT industry; developing the sales network in the market; maintaining positive relationships with existing clients; actively selling ProkHard's new technologies; delivering product presentations to clients; preparing offers and conducting sales presentations and commercial negotiations covering the scope of the offer and financing options; cooperating in the delivery of...