• Assist with quality assurance of activities and educate less experienced colleagues in related areas
• Liaise with staff on the development of system enhancements to overcome known problems or further fulfil user requirements
• Be responsible for the adoption of new software releases from systems development staff or software suppliers, and ensure the continued support of the service thereafter
• Act as Service Owner for a regional service, Global or Top Service by Revenue
• Be assigned specific project-related tasks by the Project Manager

APPLICATION YOU WILL USE:
• Unix / Linux commands
• Shell Scripting
• Oracle SQL

WHAT YOU SHOULD HAVE:
• Degree in Computer Science, Information Systems or equivalent experience
• 3–5 years of working experience performing L2 application support for large enterprise applications
• Must have hands-on knowledge of scripting languages such as JavaScript and Shell Script
This includes the ability to work unsupervised, under pressure and to meet deadlines
• Creative, with a strong commitment to quality and excellence and a continuous-improvement mindset
• Communication and time-management skills
• Strong analytical skills and efficient problem solving
• You are educated to Master's degree level in IT Security, Computer Science or equivalent, and fluent in English

WHAT ARE THE PLUS POINTS:
• Certifications such as SEC545 (SANS), CCSP (ISC2), CISSP, CCSK (CSA), Microsoft Certified Azure Security or similar
• DevSecOps experience related to application deployments on multi-cloud
• Strong experience with cloud platforms such as Google, Amazon and Azure

WHAT YOU WILL GET FROM US:
• Great team of IT professionals with global working exposure
• Ongoing professional and technical training and certifications
• Global internal job opportunities available within DPDHL
• A multicultural environment
• Meal Card and Flexible Benefits – customized according to individual needs
• Choose any day for your vacation from earned public holidays (Saturday and ad hoc)
• Smart casual dress code
• Company outpatient medical coverage
• Home office possibilities

Sounds good?
WHAT YOU WILL DO
• The role involves architecting, implementation planning and execution, with some measure of hands-on work
• Set up an enterprise monitoring framework and governance
• Research emerging related technologies in support of increased cost effectiveness and flexibility
• Product selection, inclusive of PoC and RFI/RFP support
• Provide strategic direction for standardizing the monitoring landscape and the efficient use of resources for proactive application and infrastructure monitoring
• Establish and leverage key strategic vendor relationships
• Develop and maintain build standards for related services
• Design and implement automation initiatives to optimize the current state
• Form and lead global virtual teams, creating functional touch points with both architecture and operations teams to enable the on-boarding of new products/technologies that improve the infrastructure supporting the global strategy
• Deliver complex infrastructure project implementations
• Infrastructure Problem Management – provide 3rd-level problem management support for relevant Tools and Monitoring services

APPLICATION YOU WILL USE
• Micro Focus OML
• Splunk
• Dynatrace
• SCOM
• Topaz
• IQSonar
• Micro Focus SiteScope
• TeamQuest
• IBM Tivoli

WHAT YOU SHOULD HAVE
• Degree in Computer Science, Information Systems or equivalent experience
• Minimum 10 years of experience in IT or a related field, with at least 5 of those years focusing on enterprise Tools and Monitoring
• End-to-end product lifecycle management, from strategic planning to tactical activities
• Good knowledge of operating systems, system management, hardware, Big Data, automation tools and virtualization technologies, and the interoperation of all these technologies as applicable
• Demonstrated skill in critical thinking and problem-solving methods
• Demonstrable experience designing highly available, highly scalable production systems
• Exposure to multi-cloud infrastructure
• Vision for information delivery and management, and for driving execution of the roadmap, including enterprise data architecture, big data, analytics and data management
What You’ll Do

Drive Measurable Business Impact
• Independently lead AI and analytics initiatives that generate tangible business value
• Actively contribute to and influence the company’s strategic direction

Apply Advanced Analytics & AI
• Develop and apply advanced statistical methods in an agile environment
• Work on business-critical questions using regression models, time series analysis and machine learning & AI algorithms
• Collaborate cross-functionally or drive initiatives independently

Turn Data into Action
• Design and execute analyses on large datasets
• Translate findings into clear, actionable recommendations
• Work across the full spectrum, from Excel-based analysis to deep learning models

Build Scalable AI Solutions
• Develop and own customized Data Science and AI solutions
• Work with SQL on Azure and Snowflake platforms; Python is nice to have
• Lead projects end-to-end: from Proof of Concept (PoC) to fully operational production models
• Create audience-tailored presentations
• Translate complex analytical insights into clear business language
• Deliver compelling data storytelling to support decision-making at all levels

What makes you stand out
• Minimum 3 years of experience in Data Science, AI, Advanced Analytics, or similar roles
• Proven ability to generate measurable business value through data-driven solutions
• Strong expertise in statistical modeling and machine learning techniques
• Hands-on experience with Python, SQL, Azure and Snowflake
• Experience building scalable models from PoC to production
• Strong communication and stakeholder management skills
• Ability to explain complex topics to both technical and non-technical audiences
• Bachelor’s or Master’s degree in Finance, Statistics, Computer Science, Mathematics, or a related quantitative field

We are looking forward to your application and to applicants who enrich our diverse culture!
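To give a flavour of the regression work this role describes, here is a minimal sketch of fitting a simple ordinary-least-squares model. The data and function name are invented for illustration; in practice this would be done with libraries such as statsmodels or scikit-learn on real business data.

```python
# Minimal ordinary-least-squares fit for one predictor.
# All names and numbers below are hypothetical examples.

def fit_simple_ols(xs, ys):
    """Return (intercept, slope) minimising squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical monthly revenue against marketing spend
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
revenue = [2.1, 3.9, 6.2, 8.0, 9.9]
b0, b1 = fit_simple_ols(spend, revenue)
print(round(b0, 2), round(b1, 2))  # 0.11 1.97
```

Translating the fitted slope into a statement like "each additional unit of spend is associated with roughly two units of revenue" is exactly the kind of actionable recommendation the posting refers to.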
As part of our team, you will take on the following responsibilities:
• You develop and maintain cloud-based data pipelines and data products specifically for the Finance domain, built on our Snowflake database.
• You integrate financial and transactional data from various sources into our central Data Platform, optimize ETL processes, and ensure high data quality.
• You design and build new DataFlows and DataSets for Finance use cases and create as well as manage the corresponding Power BI reports.
• You collaborate closely with Finance Product Owners, Data Scientists, and Business Units to deliver meaningful analytical data products.
• You take ownership of data governance for all Finance data products and actively drive their further development.
• You support the modernization of our Finance data architecture and contribute to the migration towards cloud- and big-data-based technologies.

What makes you stand out
• You hold a degree in Computer Science, Business Informatics, or a comparable qualification.
• You have strong expertise in SQL databases, including data modeling, table design, and data querying, ideally with experience in Snowflake.
• You bring experience in developing and integrating data products and are familiar with Azure Data Factory, Python, and modern cloud technologies (preferably Azure).
• You have hands-on experience creating meaningful Power BI reports; knowledge of CI/CD or container technologies (e.g., Docker, Kubernetes) is a plus.
• You work analytically, communicate effectively, collaborate well in teams, and demonstrate a strong hands-on mentality.
• You are fluent in English and have very good German skills; you also have a passion for financial databases and enjoy exploring new topics.
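The "ensure high data quality" responsibility above can be pictured as a validation gate in an ETL step. The column names and rules below are hypothetical; a real pipeline would apply comparable checks within Azure Data Factory or Snowflake before loading records.

```python
# Sketch of a data-quality gate in an ETL step.
# Column names and rules are invented for illustration.

def validate_rows(rows, required=("invoice_id", "amount", "currency")):
    """Split rows into (clean, rejected) based on simple quality rules."""
    clean, rejected = [], []
    for row in rows:
        missing = [col for col in required if row.get(col) in (None, "")]
        bad_amount = not isinstance(row.get("amount"), (int, float))
        if missing or bad_amount:
            rejected.append(row)
        else:
            clean.append(row)
    return clean, rejected

batch = [
    {"invoice_id": "A-1", "amount": 120.5, "currency": "EUR"},
    {"invoice_id": "A-2", "amount": None, "currency": "EUR"},  # missing amount
    {"invoice_id": "", "amount": 99.0, "currency": "USD"},     # missing id
]
clean, rejected = validate_rows(batch)
print(len(clean), len(rejected))  # 1 2
```

Rejected rows would typically be routed to a quarantine table for review rather than silently dropped, which keeps the downstream Power BI reports trustworthy.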
As part of our team, you will take on the following responsibilities:
• You are the driving force that transforms innovative AI/ML concepts from the lab environment into robust, scalable products, ensuring prototypes evolve into stable, production-ready systems.
• You operationalize the entire ML lifecycle, developing automated CI/CD pipelines, taking ownership of deploying models into production, and continuously expanding our resilient MLOps infrastructure.
• You implement algorithms and complex feature transformations efficiently within production systems, optimizing them for latency, throughput, and maximum stability.
• You build scalable data pipelines and feature stores that ensure reliable, consistent data supply for both training and serving, online and offline.
• You establish professional monitoring mechanisms such as drift detection, automated retraining, and validation processes to guarantee long-term model quality in live operations.
• You collaborate closely with Data Scientists, MLOps teams, and IT Infrastructure teams, ensuring that all production-grade AI/ML solutions follow software engineering best practices and comply with security, privacy, and regulatory requirements.
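The drift detection mentioned in the monitoring responsibilities can be sketched as a simple mean-shift check: flag a feature when its live mean moves too far from the training mean. The threshold and values are hypothetical; production systems typically use richer tests such as the population stability index or Kolmogorov–Smirnov.

```python
# Toy mean-shift drift check; thresholds and data are illustrative only.
import statistics

def drifted(train_values, live_values, z_threshold=3.0):
    """Flag drift when the live mean sits further than z_threshold
    training standard deviations from the training mean."""
    mu = statistics.mean(train_values)
    sigma = statistics.stdev(train_values)
    live_mu = statistics.mean(live_values)
    return abs(live_mu - mu) / sigma > z_threshold

train = [10.0, 11.0, 9.5, 10.5, 10.0]   # feature values at training time
stable = [10.2, 9.9, 10.4]              # live window, no drift
shifted = [14.0, 15.1, 14.5]            # live window, clear shift
print(drifted(train, stable), drifted(train, shifted))  # False True
```

In the automated-retraining setup the posting describes, a positive drift signal like this would typically trigger a retraining job and a validation gate rather than an immediate model swap.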
What makes you stand out
• You hold a master’s degree in Computer Science, Software Engineering, Mathematics, Statistics, or a related quantitative field.
• You have at least 3 years of professional experience in ML Engineering or MLOps, with a proven track record of building, deploying, and operating tailored AI/ML solutions in production.
• You bring deep expertise in Python and SQL, hands-on experience with cloud platforms (ideally Azure), and familiarity with data warehouses such as Snowflake; foundational knowledge of Infrastructure-as-Code tools like Terraform is a plus.
• You are proficient with CI/CD tools (e.g., Azure DevOps), containerization technologies (Docker/Kubernetes), and apply modern software design patterns confidently.
• You thrive in a dynamic, agile, and innovative environment, working effectively both independently and as part of a team.
• You possess excellent English communication skills, both written and spoken; German language skills are considered a plus.
What makes you stand out
• You hold a Master’s degree in Economics, Mathematics, Statistics, Computer Science, or another related quantitative field.
• You bring at least 3 years of professional Data Science experience and have a proven track record of bringing models from the lab into real-world production use cases.
• You are proficient in Python and SQL and have hands-on experience building solutions on Azure and working with Snowflake.
• You have a deep understanding of statistical methods and apply them effectively in an agile, fast-paced environment.
• You think in products, not experiments: model versioning, testing, and performance monitoring are second nature to you.
• You communicate complex topics clearly and with impact, presenting to different audiences confidently; you are fluent in English, and German skills are a plus.
Delight customers by providing outstanding technical support, service, and product selection assistance. Responsible for maintaining company property (company car, computer, office equipment, customer demo/loaner products, consignment stock, etc.) that is in his/her control and/or at his/her customer site.
Principal Accountabilities:
• Collaboration in projects of the European Data Science & Advanced Analytics Team.
• Concept, design, development and execution of complex, innovative AI/Machine Learning solutions, as well as execution and implementation of concept studies using advanced statistical methods.
• Development of deep learning models for structured medical concept extraction from unstructured data.
• Productionization of machine learning algorithms on Big Data platforms.
• Application of modern data mining and machine learning techniques in connection with Healthcare Big Data to identify complex relationships and link heterogeneous data sources.
• Advanced usage of Large Language Models for summarization, chatbots, entity extraction, etc.
• Development of foundational Deep Learning Models for assets and patients.
• Building and training new production-grade algorithms that can learn from complex, high-dimensional data to uncover patterns from which machine learning models and applications can be developed.

Our Ideal Candidate Will Have:
• Master’s degree in Computer Science, Mathematics/Statistics, Economics/Econometrics or a related field.
• Substantial professional experience in quantitative data analysis, or a PhD with at least 1 year of relevant professional experience researching machine learning algorithms.
• Very good knowledge and in-depth understanding of Machine Learning methods, both classical and deep learning models.
• Relevant experience with Natural Language Processing (NLP) models for extracting structured concepts from unstructured free text, including the design, training, and evaluation of information-extraction pipelines.
• Very strong technical capability in Python, SQL and the Hadoop ecosystem.
• Experience applying AI/Machine Learning methods to business questions.
• Very good knowledge of higher statistical and econometric methods in theory and practice.
• Experience with handling Big Data.
• Ability to write clean, reusable, production-level code.
• Excellent communication skills (written and oral), including the technical aspects of a project, the ability to develop usable documentation, results interpretation and business recommendations.
• Strong analytical mindset and logical thinking capability; strong QC mindset.
• Knowledge of the pharmaceutical market and experience with pharmaceutical data (medical, hospital, pharmacy, claims data) would be a plus, but is not a must.
• Self-responsibility for managing projects.
• Fluency in German and English.
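The "structured medical concept extraction from unstructured data" accountability can be hinted at with a toy pattern-based extractor. The regex, drug names and sample note below are all invented; the real pipelines described here would use trained NER models, not hand-written patterns.

```python
# Toy extraction of (drug, dose, unit) concepts from free text.
# Pattern and sample note are illustrative only; production pipelines
# use trained NLP models rather than regular expressions.
import re

DOSE_PATTERN = re.compile(r"(?P<drug>[A-Z][a-z]+)\s+(?P<dose>\d+)\s*(?P<unit>mg|ml)")

def extract_doses(note):
    """Return (drug, dose, unit) tuples found in a clinical note."""
    return [(m["drug"], int(m["dose"]), m["unit"])
            for m in DOSE_PATTERN.finditer(note)]

note = "Patient received Ibuprofen 400 mg twice daily and Amoxicillin 500 mg."
print(extract_doses(note))
# [('Ibuprofen', 400, 'mg'), ('Amoxicillin', 500, 'mg')]
```

Turning free text into tuples like these is what makes heterogeneous clinical sources linkable and queryable downstream, which is the point of the information-extraction pipelines the posting mentions.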