List of job offers
DevOps Engineer / Java Developer (m/k/d)
16 days ago by UPVANTA SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
DevOps Engineer / Java Developer (m/k/d) Location: Wrocław Technologies we use Expected Microsoft Azure Ubuntu RedHat Bash Ansible Terraform Docker Kubernetes Nice to have JavaScript HTML CSS SQL Angular Java About the project We are looking for an experienced specialist for the position of DevOps Engineer / Java Developer, responsible for maintaining and developing cloud environments, automating CI/CD processes, and supporting the development of web applications in Java. The role... (more)
- Managing and maintaining environments in Microsoft Azure (VMs, networking, storage, RBAC).
- Administering Linux systems (Ubuntu, RedHat).
- Designing and maintaining CI/CD processes using Jenkins.
- Automating infrastructure (Bash, Ansible, Terraform – IaC).
- Managing artifact repositories (e.g., Nexus).
- Implementing and maintaining containerization (Docker) and orchestration (Kubernetes).
- Monitoring systems, analyzing logs, and troubleshooting test environments and ...
- At least 3 years of experience as a DevOps / Cloud / Infrastructure Engineer.
- Hands-on experience with Microsoft Azure.
- Very good knowledge of Linux system administration (Ubuntu, RedHat).
- Experience with backup and DR solutions.
- Knowledge of data encryption and key management.
- Experience with Jenkins and building CI/CD pipelines.
- Ability to analyze logs and troubleshoot issues in CI/CD and production environments.
- ...
- B2B: PLN 1000–1240 net per day.
Nice to have
- Commercial experience in developing web applications in Java (backend) and frontend.
- Knowledge of OOP in web applications (elements of functional programming are a plus).
- Experience with web services / APIs (SOAP, REST).
- Good knowledge of JavaScript, HTML5, CSS3, and responsive single-page design.
- Knowledge of databases and SQL.
- Experience with Angular.
- Familiarity with Clean Code and unit testing principles.
- Experience in ...
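The log-analysis and troubleshooting duties above can be sketched in a few lines of Python. This is an illustration only, not part of the posting: the log format (`<timestamp> <LEVEL> <component>: <message>`) and the sample entries are assumptions.

```python
# Minimal log-triage sketch: count ERROR lines per component.
# The "<timestamp> <LEVEL> <component>: <message>" format is an assumption.
from collections import Counter

def error_counts(lines):
    """Return a dict mapping component name to its number of ERROR lines."""
    counts = Counter()
    for line in lines:
        parts = line.split()
        if len(parts) >= 3 and parts[1] == "ERROR":
            counts[parts[2].rstrip(":")] += 1
    return dict(counts)

log = [
    "2024-01-01T10:00 ERROR jenkins: build failed",
    "2024-01-01T10:01 INFO jenkins: retrying",
    "2024-01-01T10:02 ERROR docker: image pull timeout",
]
print(error_counts(log))  # {'jenkins': 1, 'docker': 1}
```

In practice the same triage is usually done with `grep`/`awk` or a log platform such as ELK; the sketch only shows the shape of the task.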
Solution Architect Location: Wrocław Technologies we use Expected Apache Airflow Python Google Cloud Platform About the project The role is responsible for leading and managing GCP development: assigning daily work aligned to project delivery and tracking technical delivery for on-time, high-quality completion. This includes leading technical delivery from the solution requirements (understanding the BRD and FSD, walkthrough reviews, and filling any technical solution gaps) and leading the GCP pipeline... (more)
- Lead and manage GCP development activities, ensuring timely and high-quality technical delivery.
- Develop, maintain, and optimize data pipelines using Apache Airflow and Python.
- Implement orchestrated workflows using Google Composer.
- Integrate and manage cloud services including GCS, Cloud Functions, and Cloud Run.
- Design data solutions and provide architectural guidance based on best practices.
- Prepare and maintain documentation, including functional and technical verification.
- ...
- Strong experience in developing and managing data pipelines using Apache Airflow.
- Proficiency in Google Cloud Platform (GCP) services, including GCS, Cloud Functions, and Cloud Run.
- Solid skills in Python for pipeline development.
- Ability to design scalable and efficient data solutions as a solution architect.
- Experience with data modelling and implementing data quality checks.
- Ability to create and maintain technical documentation.
- Strong customer interaction and communication ...
- Private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and 40 options on the NAIS benefit platform, including Netflix, Spotify, or Sports card.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on the NEXT platform.
- Free access to the Education First languages platform, TED Talks, and Udemy Business materials and training. -...
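The "data quality checks" mentioned in the requirements would typically run as a validation step inside an Airflow task. A minimal plain-Python sketch, assuming hypothetical column names (`id`, `amount`) that are not taken from the posting:

```python
# Illustrative data quality check; column names and rules are hypothetical.
def check_rows(rows, required=("id", "amount")):
    """Return a list of (row_index, problem) tuples for rows failing basic checks."""
    problems = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                problems.append((i, f"missing {col}"))
        amount = row.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            problems.append((i, "negative amount"))
    return problems

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5.0},  # missing id and negative amount
    {"id": 3, "amount": None},     # missing amount
]
print(check_rows(rows))
# → [(1, 'missing id'), (1, 'negative amount'), (2, 'missing amount')]
```

Inside a DAG this kind of function would usually fail the task (raise) when `problems` is non-empty, so downstream tasks never see bad data.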
Solution Architect Location: Warszawa (identical posting to the Wrocław listing above).
Solution Architect Location: Opole (identical posting to the Wrocław listing above).
Solution Architect Location: Poznań (identical posting to the Wrocław listing above).
Solution Architect Location: Lublin (identical posting to the Wrocław listing above).
Solution Architect Location: Kraków (identical posting to the Wrocław listing above).
Solution Architect Location: Katowice (identical posting to the Wrocław listing above).
Solution Architect Location: Gdańsk (identical posting to the Wrocław listing above).
Terraform Infrastructure DevOps Engineer Location: Kraków Technologies we use Expected Terraform Ansible Python PowerShell Docker Git YAML REST APIs About the project You will work on building and improving automated infrastructure environments using modern Infrastructure as Code practices. You will collaborate with teams to design and deliver stable, secure, and scalable solutions. You will also create automation, optimize deployment processes, and support continuous improvements in the... (more)
- Practical experience with Infrastructure as Code and automation.
- Hands-on skills with Terraform and Ansible.
- Ability to work with Python, PowerShell, CI/CD tools, Git, Docker, and YAML.
- Understanding of Linux/Windows systems, networking basics, and virtualization.
- Experience with SQL or storage/backup platforms is an advantage.
- Analytical mindset and willingness to improve processes.
- Ability to collaborate effectively in a team and share knowledge.
- Design and maintain ...
- Experience with Terraform and Ansible.
- Proficiency in Python and PowerShell.
- Familiarity with CI/CD tools, Git, Docker, and YAML.
- Understanding of Linux/Windows systems and networking basics.
- Experience with SQL or storage/backup platforms is a plus.
- Strong analytical skills and a collaborative mindset.
- Yearly financial bonus.
- Private medical care with Medicover, including additional packages (e.g., dental, senior care, oncology) available on preferential terms.
- Life insurance and access to the NAIS benefit platform.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on the NEXT platform.
- Free access to the Education First languages platform, Pluralsight, TED Talks, Coursera, and Udemy Business materials and ...
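At the heart of the Infrastructure as Code practice this posting describes is a desired-vs-actual comparison: tools like Terraform compute a plan of creates, updates, and deletes before touching anything. The following is a pure-Python sketch of that idea only, not Terraform's actual algorithm; the resource names are hypothetical.

```python
# Illustrative sketch of the desired-vs-actual "plan" step behind IaC tools.
# Resource names and attributes are hypothetical.
def plan(desired, actual):
    """Return the list of (action, name) changes moving `actual` toward `desired`."""
    changes = []
    for name, cfg in desired.items():
        if name not in actual:
            changes.append(("create", name))
        elif actual[name] != cfg:
            changes.append(("update", name))
    for name in actual:
        if name not in desired:
            changes.append(("delete", name))
    return changes

desired = {"vm-a": {"size": "small"}, "vm-b": {"size": "large"}}
actual = {"vm-a": {"size": "medium"}, "vm-c": {"size": "small"}}
print(plan(desired, actual))
# → [('update', 'vm-a'), ('create', 'vm-b'), ('delete', 'vm-c')]
```

Separating the plan from its application is what makes IaC changes reviewable: the diff can be inspected (or gated in CI) before anything is applied.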
DevOps Engineer Location: Kraków Technologies we use Expected OpenShift Container Platform (OCP) CI/CD IaC Linux/Unix Nice to have Red Hat OpenShift Administrator / Engineer certifications Prometheus Grafana About the project A role on an infrastructure team operating within a large financial organization. The team works with many product teams and is responsible for the stability, security, and development of container environments. This position is for someone who is comfortable... (more)
- Maintaining and developing environments based on OpenShift Container Platform
- Designing and automating CI/CD pipelines (Jenkins, GitLab CI, ArgoCD)
- Building infrastructure as code (Terraform) and automating configuration (Ansible)
- Deploying applications using Helm
- Monitoring clusters and applications (Prometheus, Grafana)
- Responding to incidents and optimizing performance
- Collaborating with development teams on application containerization
- Ensuring compliance with...
- At least 4 years of experience as a DevOps / Platform Engineer
- Hands-on, independent work with OpenShift (OCP)
- Experience designing and maintaining CI/CD pipelines
- Knowledge of automation tools (Terraform, Ansible, Helm)
- Very good knowledge of Linux and networking fundamentals
- Experience working with production environments
- Red Hat OpenShift certifications are a plus
- Experience with monitoring (Prometheus, Grafana, ELK)
- Work in a regulated environment (e.g....
- A stable project in a large organization
- Real influence on architecture and CI/CD processes
- Work with a modern OpenShift / GitOps stack
- Hybrid work model (4–6 days per month in the office)
- Co-financed sports activities
- Private medical care
DevOps Engineer Location: Warszawa (identical posting to the Kraków DevOps Engineer listing above).
Senior Systems and Application Support Engineer (L2) (m/k/n)
16 days ago by UPVANTA SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
Senior Systems and Application Support Engineer (L2) (m/k/n) Location: Wrocław Technologies we use Expected Windows Server Linux Tomcat JBoss Apache IIS Nginx Dynatrace Zabbix Grafana Prometheus RabbitMQ Kafka Postman SoapUI Bash PowerShell JIRA ServiceNow IBM SCCD Nice to have Kubernetes Docker VMware Jenkins GitLab CI GitHub Actions Azure Terraform Helm About the project Rate: PLN 800–900 per man-day, B2B. We are looking for an experienced person for the position of Senior Systems and Application Support... (more)
Big Data Developer Spark Location: Warszawa Technologies we use Expected Apache Spark Hive SQL Nice to have Git Airflow Azure About the project Together with our Partner, one of the European leaders in the banking industry, we are looking for two people for the position of Big Data Engineer. The projects concern B2B and B2C banking: platforms for large loans, professional lending, and applications supporting digital transformation. Your responsibilities Designing and... (more)
IT Onsite Specialist (m/f) Location: Wrocław Your responsibilities Site IT Coordination: Support the other assigned locations remotely by working with Site IT Coordinators Collaborate with team members to understand organizational and operational challenges Manage technical difficulties, working with vendors when necessary Act as the local IT liaison for central IT Digital Workplace in both white-collar and blue-collar environments: End user training and onboarding for new employees... (more)
- Site IT Coordination: Support assigned locations remotely by working with Site IT Coordinators.
- Collaborate with team members to understand organizational and operational challenges.
- Manage technical difficulties, working with vendors when necessary.
- Act as the local IT liaison for central IT.
- Digital Workplace in both white-collar and blue-collar environments:
- End user training and onboarding for new employees and new technology.
- Drive various projects within central IT.
- Client ...
- Minimum 5 years of experience within the IT area.
- Education: University degree in Information Technologies or similar.
- Microsoft 365 knowledge.
- ServiceNow knowledge.
- Research, analytical, and problem-solving skills.
- Knowledge of and experience troubleshooting/supporting Windows 11, mobile devices.
- Networking skills such as LAN/WAN network configuration and troubleshooting, as well as VPN client connectivity.
- Good verbal and written communication skills in both Polish and English.
- Ability to ...
- Stable employment in a global industry-leading company.
- Annual bonus and a competitive salary aligned with experience and potential.
- Comprehensive benefits package including:
- Private medical care.
- Co-financing of Multisport Card.
- Life and health insurance.
- Opportunities for personal growth and professional development.
- A supportive company culture built on trust, collaboration, and mutual respect.
- Additional benefits such as:
- Co-financing of sports activities.
- Private ...