Job listings
Employment contract
- Collaborate with software developers, quality assurance engineers, and IT professionals to ensure smooth deployment, automation, and management of software infrastructure.
- Design and implement CI/CD pipelines for multiple software applications and environments.
- Create and maintain monitoring systems to ensure high availability and performance of software applications.
- Manage and enhance cloud infrastructure on platforms such as GCP and AWS.
- Automate software deployment, configuration, and ...
- Bachelor's degree in software, mathematics, engineering, or a related field.
- At least 5 years of experience in DevOps or a related field.
- Familiarity with software development processes and methodologies.
- Experience with continuous integration/delivery tools such as Argo CD or Jenkins would be an advantage.
- Experience with cloud infrastructure platforms such as GCP and AWS.
- Excellent scripting skills in Bash, Python, Ansible, and Terraform.
- Experience with monitoring and logging tools such as Prometheus and ... (a query sketch follows this list).
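The monitoring requirement above can be illustrated with a short, hedged sketch of querying Prometheus over its standard HTTP API from Python; the server address and the `up` query are placeholder assumptions, not details from the posting.

```python
import requests

# Placeholder Prometheus endpoint; substitute your monitoring server.
PROMETHEUS_URL = "http://localhost:9090"

def query_prometheus(promql: str) -> list:
    """Run an instant PromQL query via the /api/v1/query endpoint."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": promql},
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    if payload.get("status") != "success":
        raise RuntimeError(f"Prometheus query failed: {payload}")
    return payload["data"]["result"]

if __name__ == "__main__":
    # 'up' is 1 for every target Prometheus can currently scrape.
    for series in query_prometheus("up"):
        print(series["metric"].get("instance"), series["value"][1])
```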
- Flat structure.
- International projects.
- Fully remote work.
- Online and offline meet-ups.
- Training budget.
- Small teams.
Employment contract
- Build and scale reliable data pipelines for both real-time and batch processing, using technologies like Airflow and AWS Step Functions.
- Develop automated data workflows that power analytics, reporting, and AI/ML models using services such as S3, Glue, EMR, Lambda, Redshift, and SageMaker.
- Work with diverse data sources, ensuring consistent quality, integrity, and readiness for analysis across platforms.
- Ingest, transform, and manage data from structured and semi-structured sources ...
- At least 5 years of professional programming experience implementing, developing, or maintaining Big Data systems, ideally in a business or enterprise setting.
- Strong proficiency in Python and SQL for data engineering tasks.
- Solid understanding of cloud data ecosystems, especially AWS.
- Familiarity with automation tools such as Airflow, AWS Step Functions, or similar workflow orchestrators (see the sketch after this list).
- Experience working in Unix environments and using Git for version control.
- Excellent ...
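As a rough illustration of the orchestration stack named above, here is a minimal Airflow DAG that stages a file to S3 and then starts a Glue job via boto3; the bucket, file path, and job name are hypothetical, and a production pipeline would more likely use the Amazon provider operators with proper retries.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

BUCKET = "example-raw-zone"          # hypothetical S3 bucket
GLUE_JOB = "example-transform-job"   # hypothetical Glue job name

def upload_raw_file():
    # Stage a local extract into the raw zone of the data lake.
    boto3.client("s3").upload_file("/tmp/extract.csv", BUCKET, "raw/extract.csv")

def start_glue_job():
    # Kick off the Glue transformation and log the run id.
    run = boto3.client("glue").start_job_run(JobName=GLUE_JOB)
    print("Started Glue run:", run["JobRunId"])

with DAG(
    dag_id="raw_to_curated",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    upload = PythonOperator(task_id="upload_raw_file", python_callable=upload_raw_file)
    transform = PythonOperator(task_id="start_glue_job", python_callable=start_glue_job)
    upload >> transform
```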
- Work in a supportive team of passionate enthusiasts of AI & Big Data.
- Engage with top-tier global enterprises and cutting-edge startups on international projects.
- Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
- Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks.
- Choose from various employment ...
Employment contract
We are looking for people to strengthen our AI/ML team. We work for clients from Poland and abroad. The projects mainly involve RAG, reasoning, and agentic solutions. Example projects: tools for automated information processing based on RAG; tools for extracting tabular data from conversations; tools for multi-step reasoning over data from documents/CRM; agents for automating office tasks; tools for searching and summarizing...
- Participation in AI projects for Polish and international clients
- Building AI solutions based on prompts and RAG architectures
- Preparing project requirements
- Independently leading projects
- Academic background in ML/AI
- Good knowledge of cloud solutions (with an emphasis on Azure)
- Good knowledge of LLM/GenAI tools
- Practical knowledge of LangChain for building LLM-based applications (see the sketch after this list)
- Very good command of English
- Knowledge of Synapse, ADF, or Power BI will be an asset
- Skills: LangChain, Python, SQL, Azure, Langflow, Open WebUI, Langfuse, AWS, Airflow, Supabase, Milvus, Docker
- Methodology: Agile
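To give a concrete picture of the prompt-plus-RAG work described above, here is a minimal LangChain (LCEL) sketch with an in-memory FAISS index; the example documents, the OpenAI-backed model and embeddings, and the need for an OPENAI_API_KEY are assumptions, and the posting's Azure/Milvus/Langfuse stack would slot into the same pattern.

```python
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

# Toy corpus standing in for documents ingested from a client's systems.
docs = [
    "Invoices are paid within 30 days of receipt.",
    "Support tickets are escalated after 48 hours without a response.",
]
retriever = FAISS.from_texts(docs, OpenAIEmbeddings()).as_retriever(search_kwargs={"k": 2})

prompt = ChatPromptTemplate.from_template(
    "Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
)
llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name

def format_docs(retrieved):
    # Concatenate retrieved chunks into a single context string.
    return "\n\n".join(d.page_content for d in retrieved)

chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)

print(chain.invoke("How quickly are invoices paid?"))
```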
- Sport subscription
- Flat structure
- Small teams
- International projects
- In-house trainings
- Modern office
- Startup atmosphere
- No dress code
Employment contract
- Design and optimize scalable data processing pipelines for both streaming and batch workloads using Big Data technologies such as Databricks, Apache Airflow, and Dagster.
- Architect and implement end-to-end data platforms, ensuring high availability, performance, and reliability.
- Lead the development of CI/CD and MLOps processes to automate deployments, monitoring, and model lifecycle management.
- Develop and maintain applications for aggregating, processing, and analyzing data from ...
- At least 5 years of commercial experience implementing, developing, or maintaining Big Data systems.
- Strong programming skills in Python: writing clean code, OOP design.
- Strong SQL skills, including performance tuning, query optimization, and experience with data warehousing solutions.
- Experience in designing and implementing data governance and data management processes.
- Deep expertise in Big Data technologies, including Apache Airflow, Dagster, Databricks, Spark, DBT, and other ...
- Work in a supportive team of passionate enthusiasts of AI & Big Data.
- Engage with top-tier global enterprises and cutting-edge startups on international projects.
- Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
- Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading ...
Employment contract
Expert Python engineer wanted. Do you have at least 5 years of programming experience, including at least a year with Python? (We mean open source and significant personal projects, too, not just formal jobs.) If so, you have probably written so much Python code in your life that, at times, you want to express your thoughts through list comprehensions. Thanks to your extensive experience, you can solve any problem that gets thrown at you, but that is not enough: you come up with multiple...
- Write Python code with passion and pride.
- Solve complex problems and provide multiple solutions.
- Take ownership of code from the first line to deployment.
- Work on a trustless supercluster of performance-proofed, GPU-enabled, sandboxed Docker container runners.
- Collaborate with a small, agile, senior-only team.
- Participate in weekly planning calls and daily meetings as needed.
- At least 5 years of programming experience, including at least 1 year with Python.
- Experience with Docker, Backbone, REST APIs, and open source projects (see the sketch after this list).
- Ability to work remotely and manage your own time (minimum 30 hours per week).
- Strong problem-solving skills and enthusiasm for coding.
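For the Docker experience above and the sandboxed container runners described in the responsibilities, here is a rough sketch using the docker Python SDK; the image, resource limits, and executed snippet are illustrative assumptions, and real sandboxing requires far more hardening than shown here.

```python
import docker  # pip install docker

def run_sandboxed(code: str) -> str:
    """Run a short Python snippet in a throwaway, network-less container."""
    client = docker.from_env()
    output = client.containers.run(
        image="python:3.12-slim",      # placeholder base image
        command=["python", "-c", code],
        network_disabled=True,         # no outbound network from the sandbox
        mem_limit="256m",              # cap memory
        nano_cpus=500_000_000,         # cap CPU at roughly half a core
        remove=True,                   # delete the container afterwards
    )
    return output.decode()

if __name__ == "__main__":
    print(run_sandboxed("print(sum(i * i for i in range(10)))"))
```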
- Remuneration on a B2B contract: 45-70 USD or 190-295 PLN per hour, or 7560-11760 USD or 31920-49560 PLN per month for 40 hours per week (a quick consistency check on these figures follows this list).
- Flexible work schedule with automatic rate adjustments based on inflation.
- Coverage for reasonable coworking space expenses.
- Hardware provided according to equipment co-funding policy.
- Multicultural environment with opportunities to use English daily.
- Flexible leave policy.
- Work asynchronously with a focus on work-life balance.
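The hourly and monthly figures in the remuneration line above are mutually consistent if you assume roughly 168 billable hours per month (40 hours per week over about 4.2 weeks); a quick check:

```python
# Assumed billable hours per month implied by the ranges in the posting.
HOURS_PER_MONTH = 168  # 40 h/week * ~4.2 weeks

pairs = [(45, 7560), (70, 11760), (190, 31920), (295, 49560)]
for hourly, monthly in pairs:
    assert hourly * HOURS_PER_MONTH == monthly, (hourly, monthly)
print("Each monthly figure equals the hourly rate times 168 hours.")
```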
Employment contract
- Design, build, and maintain data pipelines using Python
- Collaborate with an international team to develop scalable data solutions
- Conduct in-depth analysis and debugging of system bugs (Tier 2)
- Develop and maintain smart documentation for process consistency, including the creation and refinement of checklists and workflows
- Set up and configure new tenants, collaborating closely with team members to ensure smooth onboarding
- Write integration tests to ensure the quality and ...
- Minimum of 3-4 years as a data engineer, or in a relevant field
- Advanced experience in Python, particularly in delivering production-grade data pipelines and troubleshooting code-based bugs
- Structured approach to data insights
- Familiarity with cloud platforms (preferably Azure)
- Experience with Databricks, Snowflake, or similar data platforms
- Knowledge of relational databases, with proficiency in SQL
- Experience using Apache Spark (see the pipeline sketch after this list)
- Experience in creating and maintaining structured ...
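As a sketch of the production-grade Python pipelines this role involves, the snippet below uses PySpark to clean a raw extract and write it as partitioned Parquet; the paths, schema, and column names are hypothetical (on Azure these would typically be abfss:// locations).

```python
from pyspark.sql import SparkSession, functions as F

RAW_PATH = "/data/raw/orders.csv"        # placeholder input
CURATED_PATH = "/data/curated/orders"    # placeholder output

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

orders = (
    spark.read.option("header", True).csv(RAW_PATH)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "order_ts"])   # drop rows missing key fields
    .dropDuplicates(["order_id"])              # keep one row per order
    .withColumn("order_date", F.to_date("order_ts"))
)

# Partitioning by date keeps downstream scans cheap as volumes grow.
orders.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```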
- International projects
- Completion of certification
- Flexible working hours and remote work possibility
- Active Tech community
- Free coffee
- Modern office
Employment contract
- Design and implement a reliable, scalable Central Messaging Platform leveraging Apache Kafka, fully aligned with enterprise architecture standards to support the organization’s changing requirements.
- Create and maintain additional features and components for the Kafka-based solution.
- Contribute to the deployment and management of the cloud infrastructure supporting the Messaging Platform.
- Establish and maintain resilience and disaster recovery mechanisms.
- Develop comprehensive ...
- Min. 5 years of experience in a similar position.
- In-depth knowledge and hands-on experience in developing (JVM) and managing Apache Kafka-based solutions.
- Experience in Java development of cloud-native applications, preferably on AWS.
- Familiarity with key Apache Kafka components such as Kafka Streams, Kafka Connect, and Schema Registry (see the sketch after this list).
- Knowledge of data serialization formats and schema management, including Avro, Protobuf, and JSON.
- Strong understanding of asynchronous message ...
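The role is JVM-centric, but the asynchronous produce/consume pattern it describes can be shown compactly with the Python confluent-kafka client; the broker address, topic, and JSON payload are placeholders, and on the actual platform Avro or Protobuf with Schema Registry would replace plain JSON.

```python
import json

from confluent_kafka import Consumer, Producer

BROKER = "localhost:9092"   # placeholder broker
TOPIC = "payments.events"   # placeholder topic

# Asynchronous produce with a delivery callback.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(
    TOPIC,
    key="order-42",
    value=json.dumps({"order_id": 42, "status": "CONFIRMED"}),
    callback=lambda err, msg: print("delivery error:", err) if err else print("delivered at offset", msg.offset()),
)
producer.flush()

# A consumer (normally a separate service) polls the same topic.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "messaging-platform-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print("received:", json.loads(msg.value()))
consumer.close()
```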
- Ongoing support from a dedicated agent who cares about your project continuity, contact with clients, necessary formalities, work comfort, and development.
- Career Development Program – advice on career planning based on the latest trends and market needs in IT, including consultations with career agents and mentors.
- Access to 7N Learning & Development – a development and educational platform offering webinars, a library of articles and industry reports, and frequent invitations to both...
Employment contract
- Designing, building, and maintaining data platforms and pipelines
- Mentoring new engineers
- Developing and maintaining data pipelines to ensure seamless data flow from the Loyalty system to the data lake and data warehouse
- Collaborating with data engineers to ensure data engineering best practices are integrated into the development process
- Ensuring data integrity, consistency, and availability across all data systems
- Integrating data from various sources, including transactional ...
- 7 years in a data engineering role
- Hands-on experience in building data processing pipelines
- Experience in leading the design and implementation of data pipelines and data products
- Proficiency with GCP services for large-scale data processing and optimization
- Extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization
- Knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing (see the sketch after this list)
- Strong Python ...
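For the partitioning and terabyte-scale tuning point above, here is a small google-cloud-bigquery sketch that creates a day-partitioned, clustered table; BigQuery as the warehouse, plus the project, dataset, and schema, are assumptions based on the GCP stack the ad names.

```python
from google.cloud import bigquery

client = bigquery.Client()  # relies on default GCP credentials

# Hypothetical target table for loyalty events.
table_id = "my-project.loyalty.events"

table = bigquery.Table(
    table_id,
    schema=[
        bigquery.SchemaField("member_id", "STRING"),
        bigquery.SchemaField("event_type", "STRING"),
        bigquery.SchemaField("points", "INT64"),
        bigquery.SchemaField("event_date", "DATE"),
    ],
)
# Day partitioning plus clustering keeps scans narrow at terabyte scale.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["member_id"]

client.create_table(table, exists_ok=True)
print("Created (or found) table:", table_id)
```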
- Training budget
- Private healthcare
- Multisport
- Integration events
- International projects
- Mental health support
- Referral program
- Modern office
- Canteen
- Free snacks
- Free beverages
- Free tea and coffee
- No dress code
- Playroom
- In-house trainings
- In-house hack days
- Normal atmosphere
Employment contract
- Design and develop data architecture incorporating Snowflake, Immuta, Collibra, and cloud security.
- Implement and optimize ETL/ELT processes, manage raw data, automate workflows, and ensure high performance and reliability of data processing systems.
- Manage data quality and security, monitor quality, and implement metadata management strategies.
- Collaborate with analytics and business teams to identify user needs and deliver comprehensive solutions supporting analytics and reporting.
- ...
- Strong expertise in cloud technologies and the Snowflake ecosystem, particularly AWS, Data One Platform, Immuta, Collibra, and cloud security aspects (see the sketch after this list).
- Minimum 5 years of experience in Python programming for building and maintaining data pipelines.
- Advanced knowledge of databases, including query optimization, relational schema design, and MPP using SQL.
- Expertise in data storage technologies, including files, relational databases, MPP, NoSQL, and various data types (structured, unstructured, ...
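An illustrative ELT step against Snowflake with the snowflake-connector-python package is sketched below; the connection parameters, stage, file format, and table names are placeholders, and in practice a load like this would be orchestrated and wrapped in proper secret management.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Placeholder connection details pulled from environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load staged CSV files into a raw table, then fold them into the curated layer.
    cur.execute(
        "COPY INTO RAW.ORDERS FROM @RAW.ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("""
        MERGE INTO CURATED.ORDERS AS tgt
        USING RAW.ORDERS AS src
          ON tgt.ORDER_ID = src.ORDER_ID
        WHEN MATCHED THEN UPDATE SET tgt.STATUS = src.STATUS
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS) VALUES (src.ORDER_ID, src.STATUS)
    """)
finally:
    conn.close()
```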
- Sport subscription
- Training budget
- Private healthcare
- Small teams
- International projects
- Free coffee
- Free breakfast
- No dress code
- Modern office
Your role at Dynatrace: Dynatrace, the world's number one Software Intelligence monitoring platform, provides answers, not just data, about application performance and end-user experience. We're looking for talented Software Engineers to make our product even better.
- Develop an application in Java/Spring Boot for managing AWS native cluster and tenant lifecycle.
- Provide an automated way of creating and configuring AWS infrastructure.
- Participate in the whole implementation process, starting with talks with Product Managers and Architects through design, coding, and release preparation.
- Dedicate one day per sprint to self-development and mentoring from best-in-class software engineers.
- Work with highly skilled engineers who prioritize teamwork, ...
- Solid 5 years of experience with Java and professional experience in development and architectural design.
- Experience with major cloud providers.
- Experience with test-driven development, clean code, design patterns, etc.
- Experience in developing distributed and multi-tier applications.
- Very good command of English (min. B2).
- Solid foundation in object-oriented programming, data structures, and algorithms.
- Passion for Linux, automation, and cloud infrastructure.
- Dynatrace is a leader in unified observability and security.
- Competitive compensation packages designed to recognize and reward performance.
- Opportunity to work with the largest cloud providers, including AWS, Microsoft, and Google Cloud, and other leading partners worldwide to create strategic alliances.
- Access to cutting-edge technologies, including our own Davis hypermodal AI, to help customers modernize and automate cloud operations, deliver software faster and more securely, and ...
Your role at Dynatrace: Dynatrace, the world's number one Software Intelligence monitoring platform, provides answers, not just data, about application performance and end-user experience. We're looking for talented Software Engineers to make our product even better.
- Develop a product that is essentially a distributed debugger running invisibly in complex production environments, delivering maximum value while incurring minimum overhead.
- Develop an application that works both in multi-cloud and on-premises environments.
- Participate in the whole implementation process, starting with talks with Product Managers and Architects through design, coding, and release preparation.
- Be part of a mature development team with years of experience.
- Dedicate one ...
- A solid foundation in object-oriented programming, data structures, and algorithms.
- Solid 5 years of experience with Java and professional experience in development.
- Experience with test-driven development, clean code, design patterns, etc.
- Open-minded attitude with a willingness to learn new technologies.
- Familiarity with working in an agile environment.
- Team player with a proactive approach.
- Get-things-done attitude.
- Good English communication skills.
- Competitive compensation packages designed to recognize and reward performance.
- Attractive compensation packages and stock purchase options with numerous benefits and advantages.
- Employment contracts only, with consideration for a hybrid working setup (2-3 days per week in the office).
- Work with the largest cloud providers, including AWS, Microsoft, and Google Cloud, and other leading partners worldwide.
- Utilize cutting-edge technologies, including Davis hypermodal AI, to help ...
Our Purpose: Mastercard powers economies and empowers people in 200 countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest...
- Develop and maintain products for Mastercard's Open Banking platform.
- Ensure high-quality connectivity to banks across Europe that is frictionless, safe, and secure.
- Create low-touch solutions for customer onboarding, verification, transaction monitoring, and anti-money laundering operations.
- Work on account information and payment initiation technologies, addressing industry challenges in API security, data accessibility, processing, and analysis.
- Collaborate with a team of skilled ...
- Motivated individuals with a desire to work in a team-oriented environment.
- Experience with services and APIs built in Java and deployed to the cloud (AWS).
- Deep knowledge in relevant areas; willingness to learn is valued.
- Familiarity with modern development standards and tools, including:
- Standard ticketing systems (Jira)
- Version Control Systems (VCS)
- Test automation
- CI/CD pipelines
- Experience with provisioning infrastructure as code, containers (Docker), or interest in ...
- Opportunity to work in an established division with both industry veterans and newcomers.
- Engage in innovative projects at the forefront of open banking technology.
- Collaborate with a diverse team across multiple countries.
We are currently looking for a Senior Salesforce DevOps engineer to join our technology team.
- Create, automate, and implement improvements to the processes for releasing and deploying Salesforce applications.
- Lead, advocate for, and enforce solid CI/CD and DevOps best practices.
- Collaborate with development teams, ensuring they follow DevOps procedures and best practices.
- Manage change control process, production backups, and sandbox refreshes while continuously improving release processes.
- Plan and govern best practices for release lifecycle including version controls, branching,...
- Experience with implementing CI/CD pipelines (Azure DevOps, GitLab, Jenkins, etc.)
- Relevant experience in any of the following areas: Salesforce Development, release management, software development, DevOps.
- Solid understanding of tools and best practices for Continuous Integration & Continuous Deployment.
- Knowledge of Salesforce development lifecycle models.
- Nice to have:
- Certifications in Salesforce such as Salesforce Administrator / Advanced Administrator, Salesforce Platform ...
- Enterprise, versatile, and challenging projects with plenty of opportunities to grow and develop your skills.
- Rapid knowledge development thanks to cooperation with the best team of experienced Salesforce Architects and Developers.
- Opportunity to be a part of the greatest network of Salesforce experts.
- Be a part of Deloitte global projects, cooperating with peers all around the world.
- Access to major learning platforms: Focus on Force, Udemy, LinkedIn Learning.
- Fully supported ...
Our Purpose: Mastercard powers economies and empowers people in 200 countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest...
- Development and maintenance of Mastercard's enterprise KYC & AML platform.
- Secure high-quality connectivity to banks across Europe.
- Create low-touch solutions for customer onboarding and verification, transaction monitoring, and anti-money laundering operations.
- Work on account information and payment initiation technologies.
- Solve industry problems in API security, data accessibility, processing, and analysis.
- Collaborate with a team of skilled individuals across different ...
- Motivated individuals with a team-oriented mindset.
- Experience in the Microsoft domain, specifically APIs built in C# with ASP.NET deployed to the cloud (Azure, AWS).
- Knowledge of modern development standards and tools (Azure DevOps, Jira, VCS, test automation, CI/CD pipelines).
- Experience with provisioning infrastructure as code (Terraform) and interest in computer security (Certificates, HTTPS) is a plus.
- Prior experience in regulated industries (financial, medical, etc.) or handling...
- Opportunity to work with an established division of industry veterans and newcomers.
- Engage in a collaborative team environment.
- Continuous learning and development opportunities.
- Development of performance and availability monitoring solutions for Ruby applications
- Working primarily in C++
- Analyzing and understanding the inner workings of Ruby, Ruby apps, and gems
- Gazing beyond common frameworks and gems with a deep understanding of Ruby
- Independent design and implementation of new features and feature ownership
- Intensive cooperation with local and international development teams
- Strong understanding of modern C++
- Knowledge of Ruby and its interpreters is a plus
- Knowledge of Ruby native extensions and FFI is a plus
- Experience publishing Ruby Gems is a plus
- Ability to focus on details but keep the mind open for the whole ecosystem
- Team player and eager to learn innovative technologies
- Employment contracts (PL: Umowa o pracę / UoP)
- Hybrid working setup with 2-3 days per week in the office
- Permanent role (not a B2B contract)
- Attractive compensation packages and stock purchase options
- Base salary range starting from 16,000 PLN gross per month (depending on seniority level)