List of job offers
Solution Architect. Location: Poznań. Technologies we use (Expected): Apache Airflow, Python, Google Cloud Platform. About the project: The role is responsible for leading and managing GCP development: assigning daily work aligned to project deliverables and tracking technical delivery so it lands on time and with quality. It also covers leading technical delivery end to end, from understanding solution requirements (BRD and FSD walkthroughs and reviews) and closing any technical solution gaps, to leading the GCP pipeline code ...
- Lead and manage GCP development activities, ensuring timely and high-quality technical delivery.
- Develop, maintain, and optimize data pipelines using Apache Airflow and Python.
- Implement orchestrated workflows using Google Composer.
- Integrate and manage cloud services including GCS, Cloud Functions, and Cloud Run.
- Design data solutions and provide architectural guidance based on best practices.
- Prepare and maintain documentation, including functional and technical verification.
- ...
- Strong experience in developing and managing data pipelines using Apache Airflow.
- Proficiency in Google Cloud Platform (GCP) services, including GCS, Cloud Functions, and Cloud Run.
- Solid skills in Python for pipeline development.
- Ability to design scalable and efficient data solutions as a solution architect.
- Experience with data modelling and implementing data quality checks.
- Ability to create and maintain technical documentation.
- Strong customer interaction and communication ...
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and 40 options on our NAIS benefit platform, including Netflix, Spotify, or Sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform.
- Free access to the Education First language platform, TED Talks, and Udemy Business materials and training courses.
- ...
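The Solution Architect requirements above mention implementing data quality checks in Python as part of pipeline work. As a purely illustrative, dependency-free sketch (the function and field names are hypothetical, not taken from the listing), such a check can be as simple as validating required fields before a load step:

```python
# Hypothetical data-quality check of the kind the listing mentions:
# verify that required fields are present and non-empty before loading.
def check_rows(rows, required_fields):
    """Return a list of (row_index, missing_field) problems; empty means pass."""
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                problems.append((i, field))
    return problems

rows = [{"id": 1, "name": "alice"}, {"id": 2, "name": ""}]
print(check_rows(rows, ["id", "name"]))  # -> [(1, 'name')]
```

In a real Airflow deployment a check like this would typically run as its own task so a failed validation stops downstream loads.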
Solution Architect. Location: Lublin. Technologies we use (Expected): Apache Airflow, Python, Google Cloud Platform. About the project: The role is responsible for leading and managing GCP development: assigning daily work aligned to project deliverables and tracking technical delivery so it lands on time and with quality. It also covers leading technical delivery end to end, from understanding solution requirements (BRD and FSD walkthroughs and reviews) and closing any technical solution gaps, to leading the GCP pipeline code ...
- Lead and manage GCP development activities, ensuring timely and high-quality technical delivery.
- Develop, maintain, and optimize data pipelines using Apache Airflow and Python.
- Implement orchestrated workflows using Google Composer.
- Integrate and manage cloud services including GCS, Cloud Functions, and Cloud Run.
- Design data solutions and provide architectural guidance based on best practices.
- Prepare and maintain documentation, including functional and technical verification.
- ...
- Strong experience in developing and managing data pipelines using Apache Airflow.
- Proficiency in Google Cloud Platform (GCP) services, including GCS, Cloud Functions, and Cloud Run.
- Solid skills in Python for pipeline development.
- Ability to design scalable and efficient data solutions as a solution architect.
- Experience with data modelling and implementing data quality checks.
- Ability to create and maintain technical documentation.
- Strong customer interaction and communication ...
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and 40 options on the NAIS benefit platform, including Netflix, Spotify, or Sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on the NEXT platform.
- Free access to Education First languages platform, TED Talks, and Udemy Business materials and ...
Solution Architect. Location: Kraków. Technologies we use (Expected): Apache Airflow, Python, Google Cloud Platform. About the project: The role is responsible for leading and managing GCP development: assigning daily work aligned to project deliverables and tracking technical delivery so it lands on time and with quality. It also covers leading technical delivery end to end, from understanding solution requirements (BRD and FSD walkthroughs and reviews) and closing any technical solution gaps, to leading the GCP pipeline code ...
- Lead and manage GCP development activities, ensuring timely and high-quality technical delivery.
- Develop, maintain, and optimize data pipelines using Apache Airflow and Python.
- Implement orchestrated workflows using Google Composer.
- Integrate and manage cloud services including GCS, Cloud Functions, and Cloud Run.
- Design data solutions and provide architectural guidance based on best practices.
- Prepare and maintain documentation, including functional and technical verification.
- ...
- Strong experience in developing and managing data pipelines using Apache Airflow.
- Proficiency in Google Cloud Platform (GCP) services, including GCS, Cloud Functions, and Cloud Run.
- Solid skills in Python for pipeline development.
- Ability to design scalable and efficient data solutions as a solution architect.
- Experience with data modelling and implementing data quality checks.
- Ability to create and maintain technical documentation.
- Strong customer interaction and communication ...
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and 40 options on our NAIS benefit platform, including Netflix, Spotify, or Sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform.
- Free access to the Education First language platform, TED Talks, and Udemy Business materials and training courses.
- ...
Solution Architect. Location: Katowice. Technologies we use (Expected): Apache Airflow, Python, Google Cloud Platform. About the project: The role is responsible for leading and managing GCP development: assigning daily work aligned to project deliverables and tracking technical delivery so it lands on time and with quality. It also covers leading technical delivery end to end, from understanding solution requirements (BRD and FSD walkthroughs and reviews) and closing any technical solution gaps, to leading the GCP pipeline ...
- Lead and manage GCP development activities, ensuring timely and high-quality technical delivery.
- Develop, maintain, and optimize data pipelines using Apache Airflow and Python.
- Implement orchestrated workflows using Google Composer.
- Integrate and manage cloud services including GCS, Cloud Functions, and Cloud Run.
- Design data solutions and provide architectural guidance based on best practices.
- Prepare and maintain documentation, including functional and technical verification.
- ...
- Strong experience in developing and managing data pipelines using Apache Airflow.
- Proficiency in Google Cloud Platform (GCP) services, including GCS, Cloud Functions, and Cloud Run.
- Solid skills in Python for pipeline development.
- Ability to design scalable and efficient data solutions as a solution architect.
- Experience with data modelling and implementing data quality checks.
- Ability to create and maintain technical documentation.
- Strong customer interaction and communication ...
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and 40 options on the NAIS benefit platform, including Netflix, Spotify, or Sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on the NEXT platform.
- Free access to the Education First language platform, TED Talks, and Udemy Business materials and training courses.
- ...
Solution Architect. Location: Gdańsk. Technologies we use (Expected): Apache Airflow, Python, Google Cloud Platform. About the project: The role is responsible for leading and managing GCP development: assigning daily work aligned to project deliverables and tracking technical delivery so it lands on time and with quality. It also covers leading technical delivery end to end, from understanding solution requirements (BRD and FSD walkthroughs and reviews) and closing any technical solution gaps, to leading the GCP pipeline code ...
- Lead and manage GCP development activities, ensuring timely and high-quality technical delivery.
- Develop, maintain, and optimize data pipelines using Apache Airflow and Python.
- Implement orchestrated workflows using Google Composer.
- Integrate and manage cloud services including GCS, Cloud Functions, and Cloud Run.
- Design data solutions and provide architectural guidance based on best practices.
- Prepare and maintain documentation, including functional and technical verification.
- ...
- Strong experience in developing and managing data pipelines using Apache Airflow.
- Proficiency in Google Cloud Platform (GCP) services, including GCS, Cloud Functions, and Cloud Run.
- Solid skills in Python for pipeline development.
- Ability to design scalable and efficient data solutions as a solution architect.
- Experience with data modelling and implementing data quality checks.
- Ability to create and maintain technical documentation.
- Strong customer interaction and communication ...
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and 40 options on our NAIS benefit platform, including Netflix, Spotify, or Sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform.
- Free access to the Education First language platform, TED Talks, and Udemy Business materials and training courses.
- ...
Terraform Infrastructure DevOps Engineer. Location: Kraków. Technologies we use (Expected): Terraform, Ansible, Python, PowerShell, Docker, Git, YAML, REST APIs. About the project: You will work on building and improving automated infrastructure environments using modern Infrastructure as Code practices. You will collaborate with teams to design and deliver stable, secure, and scalable solutions. You will also create automation, optimize deployment processes, and support continuous improvements in the ...
- Practical experience with Infrastructure as Code and automation.
- Hands-on skills with Terraform and Ansible.
- Ability to work with Python, PowerShell, CI/CD tools, Git, Docker, and YAML.
- Understanding of Linux/Windows systems, networking basics, and virtualization.
- Experience with SQL or storage/backup platforms is an advantage.
- Analytical mindset and willingness to improve processes.
- Ability to collaborate effectively in a team and share knowledge.
- Design and maintain ...
- Experience with Terraform and Ansible.
- Proficiency in Python and PowerShell.
- Familiarity with CI/CD tools, Git, Docker, and YAML.
- Understanding of Linux/Windows systems and networking basics.
- Experience with SQL or storage/backup platforms is a plus.
- Strong analytical skills and a collaborative mindset.
- Yearly financial bonus.
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and access to the NAIS benefit platform.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on the NEXT platform.
- Free access to Education First languages platform, Pluralsight, TED Talks, Coursera, and Udemy Business materials and ...
DevOps Engineer. Location: Kraków. Technologies we use (Required): OpenShift Container Platform (OCP), CI/CD, IaC, Linux/Unix. Nice to have: Red Hat OpenShift Administrator / Engineer certifications, Prometheus, Grafana. About the project: A role on an infrastructure team within a large financial organization. The team works with many product teams and is responsible for the stability, security, and growth of container environments. This position is for someone who is comfortable ...
- Maintaining and developing environments based on OpenShift Container Platform
- Designing and automating CI/CD pipelines (Jenkins, GitLab CI, ArgoCD)
- Building infrastructure as code (Terraform) and automating configuration (Ansible)
- Deploying applications using Helm
- Monitoring clusters and applications (Prometheus, Grafana)
- Responding to incidents and optimizing performance
- Collaborating with development teams on application containerization
- Ensuring compliance with ...
- Minimum 4 years of experience in a DevOps / Platform Engineer role
- Hands-on, independent work with OpenShift (OCP)
- Experience designing and maintaining CI/CD pipelines
- Knowledge of automation tools (Terraform, Ansible, Helm)
- Very good knowledge of Linux and networking fundamentals
- Experience working with production environments
- Nice to have: Red Hat OpenShift certifications
- Experience with monitoring (Prometheus, Grafana, ELK)
- Work in a regulated environment (e.g. ...
- A stable project in a large organization
- Real influence on architecture and CI/CD processes
- Work with a modern OpenShift / GitOps stack
- Hybrid work model (4–6 days per month in the office)
- Co-financed sports activities
- Private medical care
DevOps Engineer. Location: Warszawa. Technologies we use (Required): OpenShift Container Platform (OCP), CI/CD, IaC, Linux/Unix. Nice to have: Red Hat OpenShift Administrator / Engineer certifications, Prometheus, Grafana. About the project: A role on an infrastructure team within a large financial organization. The team works with many product teams and is responsible for the stability, security, and growth of container environments. This position is for someone who is comfortable ...
- Maintaining and developing environments based on OpenShift Container Platform
- Designing and automating CI/CD pipelines (Jenkins, GitLab CI, ArgoCD)
- Building infrastructure as code (Terraform) and automating configuration (Ansible)
- Deploying applications using Helm
- Monitoring clusters and applications (Prometheus, Grafana)
- Responding to incidents and optimizing performance
- Collaborating with development teams on application containerization
- Ensuring compliance with ...
- Minimum 4 years of experience in a DevOps / Platform Engineer role
- Hands-on, independent work with OpenShift (OCP)
- Experience designing and maintaining CI/CD pipelines
- Knowledge of automation tools (Terraform, Ansible, Helm)
- Very good knowledge of Linux and networking fundamentals
- Experience working with production environments
- Nice to have: Red Hat OpenShift certifications
- Experience with monitoring (Prometheus, Grafana, ELK)
- Work in a regulated environment (e.g. ...
- A stable project in a large organization
- Real influence on architecture and CI/CD processes
- Work with a modern OpenShift / GitOps stack
- Hybrid work model (4–6 days per month in the office)
- Co-financed sports activities
- Private medical care
Senior Systems and Application Support Engineer (L2) (m/f/x)
16 days ago by UPVANTA SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
Senior Systems and Application Support Engineer (L2) (m/f/x). Location: Wrocław. Technologies we use (Required): Windows Server, Linux, Tomcat, JBoss, Apache, IIS, Nginx, Dynatrace, Zabbix, Grafana, Prometheus, RabbitMQ, Kafka, Postman, SoapUI, Bash, PowerShell, JIRA, ServiceNow, IBM SCCD. Nice to have: Kubernetes, Docker, VMware, Jenkins, GitLab CI, GitHub Actions, Azure, Terraform, Helm. About the project: Rate: 800–900 PLN per man-day, B2B. We are looking for an experienced person for the position of Senior Support ...
Big Data Developer (Spark). Location: Warszawa. Technologies we use (Required): Apache Spark, Hive, SQL. Nice to have: Git, Airflow, Azure. About the project: Together with our partner, one of Europe's banking leaders, we are looking for two people for a Big Data Engineer position. The projects concern B2B and B2C banking: platforms for large loans, professional lending, and applications supporting digital transformation. Your responsibilities: Designing and ...
IT Onsite Specialist (m/f). Location: Wrocław.
- Site IT Coordination: Support assigned locations remotely by working with Site IT Coordinators.
- Collaborate with team members to understand organizational and operational challenges.
- Manage technical difficulties, working with vendors when necessary.
- Act as the local IT liaison for central IT.
- Digital Workplace in both white-collar and blue-collar environments:
- End user training and onboarding for new employees and new technology.
- Drive various projects within central IT.
- Client ...
- Minimum 5 years of experience within the IT area.
- Education: University degree in Information Technologies or similar.
- Microsoft 365 knowledge.
- ServiceNow knowledge.
- Research, analytical, and problem-solving skills.
- Knowledge of and experience troubleshooting/supporting Windows 11 and mobile devices.
- Networking skills such as LAN/WAN network configuration and troubleshooting, as well as VPN client connectivity.
- Good verbal/writing skills in both Polish and English.
- Ability to ...
- Stable employment in a global industry-leading company.
- Annual bonus and a competitive salary aligned with experience and potential.
- Comprehensive benefits package including:
- Private medical care.
- Co-financing of Multisport Card.
- Life and health insurance.
- Opportunities for personal growth and professional development.
- A supportive company culture built on trust, collaboration, and mutual respect.
- Additional benefits such as:
- Co-financing of sports activities.
- Private ...
AWS DevOps Engineer. Location: Kraków. Technologies we use (Required): AWS, Git, Ansible, Terraform. Operating systems: Windows, Linux. Your responsibilities: Co-creating and developing core platform services and tools (Terraform, Python, JavaScript / React); building and automating tools that enable project teams to create secure and performant solutions (AWS SSM Automations, Ansible, CodePipeline, CodeBuild, EC2 Image Builder, Lambda, Step ...
Machine Learning / MLOps Engineer (Middle). Location: Warszawa. Technologies we use (Required): Python, Linux, ML, LLM, Kubernetes. Nice to have: MLOps, LLMOps, Big Data. About the project: You will join a small team working within MLOps and the Artificial Intelligence unit of a large financial institution. The project concerns AI solutions used at enterprise scale. Your role will be moving models from the experimental phase to a stable environment ...
- Designing and implementing ML/LLM applications
- Deploying and maintaining solutions on Linux, Kubernetes, and GCP environments
- Solving production issues (performance, scaling, monitoring)
- Co-creating MLOps standards within the organization
- Participating in the AI team's development initiatives
- Very good knowledge of Python (production code, not just notebooks)
- Experience deploying ML/LLM applications to production environments
- Very good knowledge of Linux (Debian / RHEL): working in the terminal, analyzing logs, diagnosing problems
- Hands-on experience with Kubernetes: deployment, maintenance, and troubleshooting of containerized applications
- Experience with Google Cloud Platform (GCP)
- Ability to design and implement applications using Machine ...
- Work with modern ML and LLM technologies in a large financial organization
- A stable, long-term project (12 months)
- Flexible work model: only 1 day per week in the office
- Real influence on the development of enterprise-scale AI solutions
- Opportunities to develop ML/MLOps skills
- Co-financed sports activities
- Private medical care
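The MLOps listing above centers on moving models from notebooks into stable production service. As a stdlib-only illustration of one production concern it names (input validation plus monitoring-friendly logging around a model's predict call), here is a minimal sketch; the function names and the toy "model" are hypothetical, not the team's actual stack:

```python
# Hypothetical sketch of a production wrapper around a model:
# validate inputs and emit a structured log line for each prediction.
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ml-service")

def make_predict_endpoint(model, expected_features):
    """Wrap a callable model with input validation and JSON logging."""
    def predict(payload):
        missing = [f for f in expected_features if f not in payload]
        if missing:
            raise ValueError(f"missing features: {missing}")
        result = model(payload)
        log.info(json.dumps({"input": payload, "output": result}))
        return result
    return predict

# Toy "model": a linear score over two features.
endpoint = make_predict_endpoint(
    lambda p: 0.5 * p["x"] + 0.25 * p["y"], ["x", "y"]
)
print(endpoint({"x": 2.0, "y": 4.0}))  # -> 2.0
```

In a real deployment the same wrapper idea usually lives behind an HTTP service on Kubernetes, with the log lines feeding the monitoring stack.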
Data Scientist. Location: Kraków. Technologies we use (Expected): SQL, Python. Optional: Apache Spark, AWS, Microsoft Azure, Google Cloud Platform, TensorFlow, PyTorch, Keras, Optuna, Polars. Operating system: Windows. About the project: Do you want to work on end-to-end projects that truly impact business decisions? We're looking for a Data Scientist who can turn data into real business value and implement innovative solutions.
- Design and implement machine learning models (predictive, classification, segmentation, recommendation, optimization, NLP).
- Perform data analysis, exploration, and prepare datasets for modeling.
- Prototype and test analytical approaches.
- Develop and deploy production-grade ML solutions.
- Collaborate with clients: requirements analysis, presenting concepts and results.
- Stay up to date with new technologies, libraries, and ML trends; propose innovative solutions.
- University degree in quantitative fields (econometrics, mathematics, computer science, quantitative methods).
- Minimum 3 years of experience as a Data Scientist in commercial projects.
- Strong data processing skills (SQL, Python).
- Solid knowledge of ML algorithms and hands-on implementation in Python.
- Very good command of Python (OOP, code optimization).
- Ability to communicate with business stakeholders and present results clearly.
- Languages & tools: Python, Jupyter/Zeppelin, SQL, ...
- Flexible cooperation model – choose the form that suits you best (B2B, employment contract, etc.).
- Hybrid work setup – 5 days per month in the Kraków office.
- Collaborative team culture – work alongside experienced professionals eager to share knowledge.
- Continuous development – access to training platforms and growth opportunities.
- Comprehensive benefits – including Interpolska Health Care, Multisport card, Warta Insurance, and more.
- High-quality equipment – laptop and ...
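The Data Scientist listing above describes prototyping and testing analytical approaches before production deployment. As an illustrative, dependency-free sketch of that prototyping loop (the classifier choice and all names are hypothetical examples, not from the listing), here is a nearest-centroid classifier in plain Python:

```python
# Hypothetical prototyping sketch: fit a nearest-centroid classifier
# and predict by Euclidean distance, using only the standard library.
import math

def fit_centroids(X, y):
    """Compute the mean feature vector for each class label."""
    groups = {}
    for x, label in zip(X, y):
        groups.setdefault(label, []).append(x)
    return {
        label: [sum(col) / len(rows) for col in zip(*rows)]
        for label, rows in groups.items()
    }

def predict(centroids, x):
    """Return the label whose centroid is closest to x."""
    return min(centroids, key=lambda label: math.dist(x, centroids[label]))

X = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]
y = ["low", "low", "high", "high"]
model = fit_centroids(X, y)
print(predict(model, [0.1, 0.0]))  # -> low
```

In practice a prototype like this would be benchmarked against library implementations (e.g., scikit-learn) before any production use.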
Remote work
IT Administrator (f/m). Location: Czaple (Kartuzy county). Technologies we use (Required): Linux. Operating system: Windows. About the project: Digital transformation projects you will take part in: standardizing networking and security (segmentation, separation of office and production traffic, access policies, centralized logging and observability); modernizing virtualization and containerization (host consolidation, HA on Proxmox, containerizing selected services, basics of ...