Analyze system requirements and assess the feasibility of testing in simulated environments as well as in vehicles
Design and develop test cases for system requirements, particularly for IVI systems including navigation, telematics, and streaming services
Execute integrated system tests on test benches and in in-vehicle environments
Take ownership of key test milestones, such as quality audits and final release gate checks for new vehicle platforms
Collaborate closely with Tier-1 suppliers and coordinate activities with our R&D headquarters in Korea
Track issues and manage bug fixes using JIRA, ensuring alignment with vehicle line timelines and software release plans
Review and challenge technical solutions with a focus on system performance and market competitiveness
Identify improvement opportunities based on European market trends and actively support their implementation
Bachelor's or Master's degree in Computer Science, Engineering, Electrical Engineering, or a related field
Practical experience conducting driving tests within the EU region, ideally in the area of automotive infotainment validation
Experience in system validation, preferably covering navigation, connected car services, and streaming applications
Familiarity with configuration management and version control tools
Basic knowledge of data analytics is an advantage
Strong planning and organizational skills with a structured working approach
High level of flexibility and motivation to work in an international and cross-cultural environment
Very good command of the English language, both written and spoken
Willingness to travel and conduct driving tests; valid European driving license (Category B) required
Pleasant working atmosphere
Challenging and varied tasks in a promising and innovative industry
Dynamic and innovative market environment
A highly motivated team and an open communication style
Your contact: Annika Chrysant, reference number 858521/1
E-mail: annika.chrysant@hays.de. Employment type: employment with Hays Professional Solutions GmbH
Your assignments
Producing designs, both initial outlines and full plans, of sewerage, water treatment and flood defence structures such as pump systems and pipe networks
Presenting project details and technical information to colleagues and clients
Writing reports
Managing project budgets
Keeping up to date with changes in regulatory legislation and guidelines
Writing and advertising tender documents and managing contracts
Liaising with clients, contractors, government agencies, local authorities and suppliers
Supervising local staff and site workers
Using a variety of specialist computer applications/simulation software
Maintaining effective relationships with members of all divisions and departments responsible for performing services related to the project
Preparing construction cost estimates, performing construction monitoring and coordinating field activities
Your profile
University degree (M.Sc.) in civil, mechanical, process or environmental engineering, or another suitable qualification
Experience in the preparation of master plans, feasibility studies, tender documents and construction planning, and/or experience with construction supervision, project management and/or commissioning of water supply and/or wastewater treatment facilities
Preferably 5 years of professional experience in international consulting/engineering in the field of urban water management, preferably in projects financed by bilateral and multilateral donor organisations and development banks
Structured approach to work, a high degree of flexibility, initiative, persuasiveness and negotiation skills, as well as the ability to work in a team
Willingness to travel extensively
Good knowledge of written and spoken English and preferably an additional business language (French, Spanish, etc.)
YOUR TASKS:
Design, develop and deploy digital solutions, following the software development life cycle in an agile setup
Develop solutions on a leading-edge cloud-based platform for managing and analyzing large datasets
Create technical documentation
Analyze and decompose business requirements into technical functionalities
Produce clean and efficient code based on business requirements and specifications
Create notebooks, pipelines and workflows in Scala or Python to ingest, process and serve data in our platform
Be a technical lead for junior and external developers
Be part of the continuous improvement of Nordex's development processes by participating in retrospectives and proposing optimizations
YOUR PROFILE:
Technical degree in Computer Science, Software Engineering or comparable
Experience or certification in Databricks
Fluent English
At least 3 years of proven experience
Availability to travel
YOUR BENEFITS:
In addition to the opportunity to make our world a little more sustainable, we offer you:
*Some offers may vary by location.
**Hybrid working in accordance with the company's internal policy.
What makes you stand out
Required Qualifications:
You hold a degree in Computer Science or a related field and have 3+ years of experience in agile software development, with knowledge of the full engineering lifecycle.
You have hands-on experience with Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), MCP servers, agentic AI, and other advanced AI solutions, including practical experience in building and deploying such systems.
You are familiar with AI cloud platforms such as Azure AI Foundry and/or conversational AI platforms (e.g., Cognigy), have hands-on DevOps skills, ideally in the Azure Cloud (ADO), and bring basic knowledge of cloud technologies, preferably Microsoft Azure.
You are familiar with programming fundamentals in at least one high-level language such as Python, Java, or TypeScript, ideally developing microservices using frameworks such as Flask, Django, or Spring (Java).
You communicate clearly and effectively in English and German.
RAG applications, agents or tooling) into stable, productive services such as APIs, backends or workers, including architecture, error handling and technical documentation
Building reliable data and document pipelines (ingestion, transformation, indexing/embeddings, retrieval) as central building blocks for production-ready AI solutions
Developing and establishing comprehensive test and quality strategies (unit, integration and end-to-end tests) as well as LLM-specific evaluation and regression mechanisms (eval sets, guardrails, versioning)
Building and operating CI/CD pipelines, environments (dev/test/prod) and release processes for the reliable delivery of GenAI-based services
Implementing modern observability (logging, tracing, metrics, dashboards, alerts) and applying security and governance standards (identity/RBAC, secrets, policies)
Your profile
Completed degree in computer science, business informatics or a comparable qualification
Several years of experience in software engineering, ideally with a focus on Python; additional knowledge of TypeScript, Java or C# is an advantage
Experience in the productive development of backend services, including tests, CI/CD and integration landscapes
Sound technical understanding of the operation and stability of production systems, in particular monitoring, alerting, incident readiness, and performance and cost awareness
Practical experience with cloud technologies, preferably Microsoft Azure (identity, secrets management, storage/compute, networking fundamentals)
A structured, quality-oriented way of working and the ambition to bring high-quality, scalable GenAI solutions into production
Location: Dortmund, North Rhine-Westphalia
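The pipeline stages named above (ingestion, transformation, indexing/embeddings, retrieval) can be illustrated with a minimal retrieval step. This is a sketch only: a real deployment would use an embedding model and a vector store, while here the document ids and embedding vectors are invented toy values and similarity is plain cosine.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, top_k=2):
    # Rank indexed document chunks by similarity to the query embedding
    # and return the ids of the top_k closest chunks.
    scored = [(cosine(query_vec, vec), doc_id) for doc_id, vec in index.items()]
    scored.sort(reverse=True)
    return [doc_id for _, doc_id in scored[:top_k]]

# Toy index: chunk id -> embedding (real systems use an ANN/vector store).
index = {
    "doc-a": [0.9, 0.1, 0.0],
    "doc-b": [0.1, 0.9, 0.0],
    "doc-c": [0.7, 0.3, 0.1],
}
print(retrieve([1.0, 0.0, 0.0], index))  # doc-a and doc-c are closest
```

In production the retrieved chunks would then be passed to the LLM as grounding context; the evaluation and guardrail mechanisms mentioned above would wrap this step with regression checks.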
As part of our team, you will take on the following responsibilities:
You develop and maintain cloud-based data pipelines and data products specifically for the Finance domain, built on our Snowflake database.
You integrate financial and transactional data from various sources into our central Data Platform, optimize ETL processes, and ensure high data quality.
You design and build new DataFlows and DataSets for Finance use cases and create as well as manage the corresponding Power BI reports.
You collaborate closely with Finance Product Owners, Data Scientists, and Business Units to deliver meaningful analytical data products.
You take ownership of data governance for all Finance data products and actively drive their further development.
You support the modernization of our Finance data architecture and contribute to the migration towards cloud- and big-data-based technologies.
What makes you stand out
You hold a degree in Computer Science, Business Informatics, or a comparable qualification.
You have strong expertise in SQL databases, including data modeling, table design, and data querying, ideally with experience in Snowflake.
You bring experience in developing and integrating data products and are familiar with Azure Data Factory, Python, and modern cloud technologies (preferably Azure).
You have hands-on experience creating meaningful Power BI reports; knowledge of CI/CD or container technologies (e.g., Docker, Kubernetes) is a plus.
You work analytically, communicate effectively, collaborate well in teams, and demonstrate a strong hands-on mentality.
You are fluent in English and have very good German skills; you also have a passion for financial databases and enjoy exploring new topics.
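The "optimize ETL processes and ensure high data quality" duty above can be sketched as a minimal completeness check that an ETL step might run before loading. Snowflake is stood in for by Python's built-in sqlite3 so the example is self-contained; the table and column names (`fin_transactions`, `amount`, `booking_date`) are purely illustrative.

```python
import sqlite3

# Hypothetical staging table standing in for a Snowflake source (sqlite3
# here so the sketch runs anywhere); all data is invented.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fin_transactions (id INTEGER, amount REAL, booking_date TEXT)")
conn.executemany(
    "INSERT INTO fin_transactions VALUES (?, ?, ?)",
    [(1, 120.50, "2024-01-03"), (2, None, "2024-01-04"), (3, 99.99, None)],
)

def quality_report(conn):
    # Basic completeness checks: row count plus NULL counts in key columns.
    total = conn.execute("SELECT COUNT(*) FROM fin_transactions").fetchone()[0]
    null_amount = conn.execute(
        "SELECT COUNT(*) FROM fin_transactions WHERE amount IS NULL").fetchone()[0]
    null_date = conn.execute(
        "SELECT COUNT(*) FROM fin_transactions WHERE booking_date IS NULL").fetchone()[0]
    return {"rows": total, "null_amount": null_amount, "null_booking_date": null_date}

print(quality_report(conn))
```

A real pipeline would fail or quarantine the batch when such counters exceed a threshold, rather than just printing them.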
As part of our team, you will take on the following responsibilities:
You are the driving force that transforms innovative AI/ML concepts from the lab environment into robust, scalable products, ensuring prototypes evolve into stable, production-ready systems.
You operationalize the entire ML lifecycle, developing automated CI/CD pipelines, taking ownership of deploying models into production, and continuously expanding our resilient MLOps infrastructure.
You implement algorithms and complex feature transformations efficiently within production systems, optimizing them for latency, throughput, and maximum stability.
You build scalable data pipelines and feature stores that ensure reliable, consistent data supply for both training and serving, online and offline.
You establish professional monitoring mechanisms such as drift detection, automated retraining, and validation processes to guarantee long-term model quality in live operations.
You collaborate closely with Data Scientists, MLOps teams, and IT Infrastructure teams, ensuring that all production-grade AI/ML solutions follow software engineering best practices and comply with security, privacy, and regulatory requirements.
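The monitoring duties above mention drift detection. As a deliberately simple illustration (not the PSI or Kolmogorov-Smirnov tests more typical in production, and with all numbers invented), a mean-shift check against the training baseline might look like:

```python
import statistics

def detect_drift(baseline, live, z_threshold=3.0):
    # Flag drift when the live feature mean deviates from the training
    # baseline by more than z_threshold standard errors. A toy proxy for
    # production drift detectors; thresholds would be tuned per feature.
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    standard_error = sigma / (len(live) ** 0.5)
    z = abs(statistics.mean(live) - mu) / standard_error
    return z > z_threshold

# Invented feature values: a stable window and a clearly shifted one.
baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
print(detect_drift(baseline, [10.0, 10.1, 9.9, 10.2]))   # in-distribution: False
print(detect_drift(baseline, [12.5, 12.7, 12.4, 12.6]))  # shifted mean: True
```

In the automated-retraining loop described above, a `True` result would typically trigger an alert and, after validation, a retraining job.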
What makes you stand out
You hold a master's degree in Computer Science, Software Engineering, Mathematics, Statistics, or a related quantitative field.
You have at least 3 years of professional experience in ML Engineering or MLOps, with a proven track record of building, deploying, and operating tailored AI/ML solutions in production.
You bring deep expertise in Python and SQL, hands-on experience with cloud platforms (ideally Azure), and familiarity with data warehouses such as Snowflake; foundational knowledge of Infrastructure-as-Code tools like Terraform is a plus.
You are proficient with CI/CD tools (e.g., Azure DevOps), containerization technologies (Docker/Kubernetes), and apply modern software design patterns confidently.
You thrive in a dynamic, agile, and innovative environment, working effectively both independently and as part of a team.
You possess excellent English communication skills, both written and spoken; German language skills are considered a plus.
Our technologies are hidden in many areas of everyday life and you encounter them more often than you might think: for example in smartphones, on glass facades, in solar and PV systems, or in microchips for the computer and communications industry.
Your tasks
Smartphones, smart homes, smart factories: electrical engineering is changing our world not only in the private sphere, it is also turning industry upside down.
• Leverage knowledge of Engineer to Order (ETO) and other Product Lifecycle Management (PLM) solutions to ensure seamless E2E process integration.
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
• Many years of experience as an SAP Consultant, with a focus on S/4 HANA, SAP PP, QM, Serialization, and Batch Management.
• Proven experience in SAP project roll-out implementations.
• Strong analytical and problem-solving skills, with the ability to think critically and strategically.
• Excellent communication and interpersonal skills, with the ability to work collaboratively across teams.
• Experience with Serialization and Batch Management in an S/4 HANA environment is highly desirable.
• Knowledge of Engineer to Order (ETO) processes and other PLM solutions is a significant advantage.
As part of our team, you will take on the following responsibilities:
You are responsible for the stable operation and continuous development of our Microsoft 365 environment, including Intune, Teams, SharePoint, OneDrive, Copilot, and related services.
You actively contribute to the introduction of new features, technologies, and use cases within the Modern Workplace environment and support them from planning through to operational handover.
You automate and standardize processes using PowerShell and Microsoft Graph to ensure efficiency, quality, and scalability.
You ensure that our systems consistently comply with applicable security, compliance, and governance requirements.
You work closely with internal stakeholders, support rollouts, and contribute to high user adoption through training, documentation, and enablement activities.
What makes you stand out
You hold a degree in computer science or have a comparable technical qualification.
You have experience in operating and further developing Microsoft 365 services and feel confident in modern workplace environments.
You bring solid knowledge of Entra ID (Azure AD), Intune, Conditional Access, Microsoft Teams, and SharePoint Online.
You have hands-on experience in PowerShell scripting and ideally in working with Microsoft Graph.
You are interested in topics such as automation, security, and governance, and are motivated to further develop your skills in these areas.
You work in a structured, self-reliant manner and enjoy collaborating in a team.
You have very good German and English skills, both written and spoken.
Who you are
5+ years of professional experience in Platform Engineering, DevOps, or Site Reliability Engineering (SRE), with a significant focus on cloud infrastructure
Fluency in scripting languages (e.g., Python, Go, Bash) for system automation, tooling development, and operational tasks
Deep expertise in managing and scaling production workloads within a major public cloud provider (e.g., AWS, Azure, or GCP), including strong familiarity with core services like Compute, Networking, Identity & Access Management (IAM), and Managed Databases
Proven mastery of Infrastructure-as-Code (IaC) using AWS CloudFormation and/or Terraform in complex, multi-account environments
Demonstrated experience designing, implementing, and maintaining robust CI/CD pipelines
Solid knowledge of monitoring and logging solutions
Excellent communication and documentation skills, with the ability to articulate complex technical issues to technical stakeholders
Benefits
Hybrid working, sports courses, free admission to code.talks, exclusive employee discounts, free drinks, language courses, free Laracasts account, company events, relocation support, mobility allowance, state-of-the-art technologies, central location, company pension scheme, further training offers, dogs allowed, AY Academy, feedback culture, company bicycle
YOU ARE THE CORE OF ABOUT YOU.
Your profile as Integration Specialist (m/f/d) at the Erlangen location
Proven experience in cloud migration projects, especially GCP to Azure.
Degree in Computer Science or equivalent qualification.
Strong knowledge of modern development tools (e.g., MS Visual Studio).
Proficiency in programming languages (C#, Python, SQL, PowerShell) and interface technologies (REST, JSON).
Key Responsibilities:
·Provides specialized administrative support in the development, implementation, and marketing of our two HR development programs
·Serves as a central point of contact between prospective employees, vendors, staff, other departments, and/or external constituencies on day-to-day operational and administrative matters; assists with meetings, special projects, and/or general problem resolution
·Coordinates activities and administration of program objectives; this requires engaging internal and external stakeholders
·Monitors and administers program/project expenses; may develop or participate in the development of funding proposals for the program
·Writes, edits, and coordinates development of course catalogs, promotional materials, educational materials, training manuals, newsletters, and/or brochures, as appropriate to the program
·Maintains program/project records, researches information and gathers and computes various data; prepares special and/or one-time reports, summaries, or replies to inquiries, selecting relevant data from a variety of sources.
Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
Support model development by assisting with training, validation, and optimization of machine learning workflows
Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
Solve technical challenges related to data access, pipeline performance, and software limitations
Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
Perform image analysis and prepare datasets required for scientific and ML use cases
Manage and improve ETL processes to ensure data quality, structure, and availability
Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility
Academic background in computer science, data science, engineering, or a related quantitative field
Strong proficiency in Python with expertise in scientific and analytical libraries
Skilled in SQL and working with relational databases
Understanding of ETL concepts and practical experience working with data pipelines
Solid foundation in machine learning principles and the model lifecycle
Ability to perform image analysis for scientific or research applications
Strong communication and interpersonal skills with the ability to collaborate in a technical team
Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices
Opportunity to contribute directly to active R&D projects with immediate real-world impact
Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
Close collaboration with a small, highly skilled technical team
Your contact: reference number 863771/1, phone +41 44 225 50 00, e-mail positionen@hays.ch. Employment type: freelance for a project.
• Able to meet utilization targets and standard times.
• Accurately enters Database/SAP transactions in the computer.
• Maintains daily records, including labor sheets and routers.
• Responsible for observing and following all Environmental, Health and Safety rules and procedures.
• Meets utilization targets and standard times.
• Accurately enters Database/SAP transactions in the computer.
• Maintains daily records, including labor sheets and routers.
• Responsible for observing and following all Environmental, Health and Safety rules and procedures.
We will need you to possess the following essential qualities and skills:
Recognized degree in IT, Computer Science, Software Engineering, Finance, Business Information Technology or relevant fields
Above 3.0 CGPA / Second Class Upper (Hons)
Less than 1 year of working experience
Highly self-motivated team player with good analytical and conceptual thinking, results-driven with initiative and good negotiation skills
All-rounded individual with a deep interest in the IT industry who enjoys challenges and strives to work with professionals in the IT industry globally
Able to quickly absorb professional knowledge; good at problem-solving, contribution and leadership
Eagerness to take responsibility for complex processes and ability to multitask
Excellent verbal and written English
WHAT IS THE PLUS POINT(S)
Experience in self-initiated projects
Basic programming skills in any programming language
WHAT YOU WILL GET FROM US
Great team of IT professionals with global working exposure
Meal Card and Flexible Benefits, customized according to individual needs
On-going professional and technical training and certifications
A multicultural environment in modern offices
Choose any day for your vacation from earned public holidays (Saturday and ad hoc)
Smart casual every day
Global internal job opportunities available within DPDHL
Unlimited outpatient medical
Home office possibilities
If you are up to the challenge, contact us today and we will ensure that you will have an experience of a lifetime.
You will work in an internationally oriented market segment with a broad customer base of OEMs.
YOUR QUALIFICATIONS
Completed technical degree in computer science, electrical engineering or mechatronics
Professional experience in a comparable environment and practical experience in the analysis and specification of system requirements, preferably in the automotive industry using DOORS and Rhapsody
Proficient use of the SysML modeling language and experience in the development of ASIL C/D systems
Systematic, process-oriented approach (SPICE, V-model) and very good technical understanding
Experience in the field of digital control engineering and power electronics is desirable
Experience with System FMEA and System FTA is desirable
Excellent communication skills in English
Enthusiasm and a desire to further develop oneself and the HV DC/DC and OBC products
WHAT WE OFFER
Mobile working and flexible working time models
Extensive career and further training opportunities
Sports and health program "HELLA in Motion" as well as partnership with the company fitness network WELLPASS
Discounted Germany ticket for bus and train
Free coffee and water, company restaurants and cafés
Bicycle leasing and numerous employee benefits
Company pension plan with HELLA allowance
Family support: family service and advice on caring for relatives
Employee events
Even if you do not meet all our requirements, do not hesitate to apply to us, because the further development of our employees is very important to us and opens up a wide range of opportunities for you in our company.
Main Tasks and Key Responsibilities
Overall goals / Typical measures
Seeks and prospects for BC targets to win new customers, generally in the 30k-500k Euro range of net sales per annum
Plans and manages medium to large-sized business customers
Builds rapport and trust with customers by being informed about the customer's business and the market
Assesses the type and size of customer needs
Recommends solutions based on customer needs by using industry knowledge
Is responsible for closing business by connecting a customer's needs to a DHL solution, offering value to the client's supply chain
Supports customer growth by conducting joint visits with Product and TL, and organizing workshops inviting customers to share information on updated regulations, products, etc.
Uses networks within the various sales channels within DP DHL to collaborate on customers and marketing strategies, and offers a full supply chain of services to meet customer needs
Collects relevant customer information for the RFI/RFP/RFQ and prepares documents for customer implementation in order to ensure proper operational handover and implementation to meet customer expectations (SLAs & SOPs)
Is a master user of DGF CRM
Transfers SC with high value potential to the key account sales channel
Job Requirements
Computer literate (all MS programs) and willing to learn additional computer skills
Ability to work in a team environment and transfer knowledge to team members
Ability to work within time constraints, make own decisions and advise the client on logistics solutions
Punctuality and time management skills are essential elements of this position
Ability to work under extreme pressure with minimal supervision
Ability to communicate effectively on all levels
Ability to use initiative and be a self-starter
Creativity and problem-solving capabilities
Outstanding customer relationship skills a necessity
Must have previous freight experience
Skills / Qualifications
Key capabilities / Competencies
Competency segment 'Business'
Analysis: Breaks down a problem, situation or process into its component parts, separates the main issues from side-issues, understands the nature of the parts and their relationship to one another.
This includes the ability to work unsupervised, under pressure and meet deadlines
• Creative with a strong commitment to quality and excellence, and a continuous improvement mindset
• Communication and time-management skills
• Strong analytical skills and efficient problem solving
• You are educated at master's degree level in IT Security, Computer Science or equivalent, fluent in English
WHAT IS THE PLUS POINT(S):
• Certifications such as SEC545 (SANS), CCSP (ISC2), CISSP, CCSK (CSA), Microsoft Certified Azure Security or similar
• DevSecOps experience related to application deployments on multi-cloud
• Strong experience with cloud platforms such as Google, Amazon and Azure
WHAT YOU WILL GET FROM US:
• Great team of IT professionals with global working exposure
• On-going professional and technical training and certifications
• Global internal job opportunities available within DPDHL
• A multicultural environment
• Meal Card and Flexible Benefits, customized according to individual needs
• Choose any day for your vacation from earned public holidays (Saturday and ad hoc)
• Smart casual dress code
• Company outpatient medical
• Home office possibilities
Sounds good?
elasticsearch, AWS, Python, Google BigQuery, Google Cloud Platform, NumPy, Pandas, GitLab
What you will do
Design and develop innovative algorithms to power a personalized shopping experience, leveraging cutting-edge machine learning techniques
Deploy your solutions into production, taking full ownership and ensuring high performance and scalability
Combine your data science expertise with a pragmatic, agile approach to find innovative solutions and drive measurable results within a fast-paced environment
Challenge the status quo by identifying areas for improvement in existing retrieval and reranking systems, particularly those relying heavily on business logic, and propose data-driven solutions
Thrive in a dynamic, fast-paced environment with a flat hierarchy, where your ideas and contributions can make a real difference
Who you are
Proficiency in Python or experience with at least one scientific computing language (e.g., MATLAB, R, Julia, C++)
Strong SQL skills with experience in analytical or transactional database environments
Theoretical understanding of machine learning principles, coupled with a hands-on approach to building and iterating on models
Proven experience in building and deploying machine learning solutions that deliver tangible business value
Strong understanding of data structures, algorithms, and tools for efficiently handling large datasets (e.g. pandas, numpy, dask, arrow, polars, …)
Experience designing, building, and managing data pipelines
Familiarity with cloud-based model training and serving platforms (e.g., GCP Vertex AI, Amazon SageMaker)
Solid understanding of statistical methods for model evaluation
Big Data: Experience analyzing large datasets using statistical and machine learning techniques
DevOps: Familiarity with CI/CD tools (e.g., GitLab CI/CD, HashiCorp Terraform) is a plus
Generative AI: Experience with generative AI and agentic frameworks (e.g., LangChain, ADK, CrewAI, Pydantic AI, …) is a plus
Understanding of recommendation, retrieval and reranking systems in e-commerce and retail is a plus
Excellent written and verbal communication skills in English
Ability to effectively communicate complex machine learning concepts to both technical and non-technical stakeholders
Proven ability to collaborate effectively within a team to establish standards and best practices for deploying machine learning models
A proactive approach to knowledge sharing and fostering a quick development environment
Nice to have
Experience with BigQuery
Knowledge of time series and (graph) neural network models
Familiarity with statistical testing and Gaussian Processes
Strong knowledge of Computer Vision libraries (e.g., OpenCV, TensorFlow, PyTorch)
Experience maintaining machine learning pipelines through MLOps frameworks (e.g.
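The role's theme of moving reranking away from hard-coded business logic toward data-driven scoring can be sketched as a tunable blend of a model relevance score and a secondary signal. The item ids, field names, and weighting below are all illustrative assumptions, not the company's actual system:

```python
def rerank(candidates, alpha=0.7):
    # Blend a model relevance score with a popularity signal; alpha weights
    # relevance. In a data-driven setup, alpha would be learned or tuned
    # via A/B testing rather than fixed by a business rule.
    def blended(item):
        return alpha * item["relevance"] + (1 - alpha) * item["popularity"]
    return [item["id"] for item in sorted(candidates, key=blended, reverse=True)]

# Toy candidate set as it might come back from a first-stage retriever.
candidates = [
    {"id": "sku-1", "relevance": 0.9, "popularity": 0.2},
    {"id": "sku-2", "relevance": 0.5, "popularity": 0.9},
    {"id": "sku-3", "relevance": 0.8, "popularity": 0.6},
]
print(rerank(candidates))
```

With alpha at 0.7 the balanced item (sku-3) outranks both the purely relevant and the purely popular one, which is exactly the kind of trade-off a learned reranker makes explicit.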
Review operational costs, negotiate contracts with vendors, and manage vendor relationships.
Requirements:
Bachelor's degree in Engineering, Computer Science, or a related field
HV Authorised Person (experienced with HV systems)
Electrical/Mechanical Engineering HNC or HND (successfully completed apprenticeship in either)
C&G Pts. 1 & 2, equivalent or exceeds
17th Edition IEE: Wiring and Installation (ability to attain 18th Edition through additional training)
C&G 2391 test and inspection; BS 7671:2001 for inspection, testing and certification
Principal Accountabilities:
Collaboration in projects of the European Data Science & Advanced Analytics team
Concept, design, development and execution of complex, innovative AI/machine learning solutions, as well as execution and implementation of concept studies using advanced statistical methods
Development of deep learning models for structured medical concept extraction from unstructured data
Productionalization of machine learning algorithms on Big Data platforms
Application of modern data mining and machine learning techniques in connection with healthcare Big Data to identify complex relationships and link heterogeneous data sources
Advanced usage of Large Language Models for summarization, chatbots, entity extraction, etc.
Development of foundational deep learning models for assets and patients
Building and training new production-grade algorithms that can learn from complex, high-dimensional data to uncover patterns from which machine learning models and applications can be developed
Our Ideal Candidate Will Have:
Master's degree in Computer Science, Mathematics/Statistics, Economics/Econometrics or a related field
Substantial years of professional experience in quantitative data analysis, or a PhD with at least 1 year of relevant professional experience with research in machine learning algorithms
Very good knowledge and in-depth understanding of machine learning methods, both classical and deep learning models
Relevant experience with Natural Language Processing (NLP) models for extracting structured concepts from unstructured free text, including the design, training, and evaluation of information-extraction pipelines
Very strong technical capability in Python, SQL and the Hadoop ecosystem
Experience applying AI/machine learning methods to business questions
Very good knowledge of higher statistical and econometric methods in theory and practice
Experience with handling Big Data
Ability to write clean, reusable, production-level code
Excellent communication skills (written and oral), including technical aspects of a project, the ability to develop usable documentation, results interpretation and business recommendations
Strong analytical mindset and logical thinking capability, strong QC mindset
Knowledge of the pharmaceutical market and experience with pharmaceutical data (medical, hospital, pharmacy, claims data) would be a plus, but is not a must
Self-responsibility for managing projects
Fluency in German and English
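The structured-concept-extraction work described above would in practice use trained NER or LLM-based models; as a shape-only illustration of the task (free text in, structured concepts out), a rule-based extractor with an invented pattern, drug names, and note text might look like:

```python
import re

# Rule-based stand-in for an NLP information-extraction pipeline: capture
# medication mentions of the form "<Drug> <dose> mg|ml". Real systems would
# use trained models; this only illustrates the input/output contract.
DOSE_PATTERN = re.compile(r"(?P<drug>[A-Z][a-z]+)\s+(?P<dose>\d+)\s*(?P<unit>mg|ml)")

def extract_concepts(note):
    # Return one structured record per matched mention.
    return [m.groupdict() for m in DOSE_PATTERN.finditer(note)]

note = "Patient received Ibuprofen 400 mg twice daily and Amoxicillin 500 mg."
print(extract_concepts(note))
```

Evaluating such a pipeline, as the posting requires, means scoring these extracted records against a gold-standard annotation set (precision/recall per concept type).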