Make an impact:
• Development of our Customer Relationship Management (CRM) and Supplier Relationship Management (SRM) based on Dynamics 365 Online
• Development of further business applications based on different modular elements of the MS Cloud Stack (Power Apps, Power Automate, Azure, etc.)
• Working with state-of-the-art tools and development methods
• Co-determination and design of our development methodology, release processes, and agile approach
• Support of our development team and management of external developers
• Effective collaboration with our CRM/SRM team and with surrounding project and product teams
• Active involvement in solution design, from the technical requirement to the finished solution
What makes you stand out
• Education: successfully completed degree in (business) computer science or a comparable qualification
• At least 5 years' experience in the development of .NET, Dynamics, and Azure cloud components (C#, JavaScript, DevOps, Visual Studio, etc.)
Assist with quality assurance activities and mentor less experienced colleagues in related areas.
• Liaise with staff on the development of system enhancements to overcome known problems or further fulfil user requirements
• Be responsible for the adoption of new software releases from systems development staff or software suppliers, and ensure the continued support of the service thereafter
• Act as Service Owner for a regional service, Global Service, or Top Service by Revenue
• Be assigned specific project-related tasks by the Project Manager
APPLICATIONS YOU WILL USE:
• Unix/Linux commands
• Shell scripting
• Oracle SQL
WHAT YOU SHOULD HAVE:
• Degree in Computer Science, Information Systems, or equivalent experience
• 3–5 years of working experience performing L2 application support for large enterprise applications
• Hands-on knowledge of scripting languages such as JavaScript and shell script
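The L2 support work described above typically involves small triage scripts over application logs. A minimal sketch, assuming a hypothetical timestamped log format with an " ERROR " marker (the real format and tooling would differ):

```python
# Illustrative sketch only: a minimal log-triage helper of the kind an L2
# support engineer might script. The log format is a hypothetical example.
from collections import Counter


def summarize_errors(log_lines):
    """Count occurrences of each distinct ERROR message in a log."""
    counts = Counter()
    for line in log_lines:
        if " ERROR " in line:
            # Keep only the message text after the ERROR marker.
            message = line.split(" ERROR ", 1)[1].strip()
            counts[message] += 1
    return counts


if __name__ == "__main__":
    sample = [
        "2024-05-01 10:00:01 INFO  job started",
        "2024-05-01 10:00:02 ERROR connection refused",
        "2024-05-01 10:00:03 ERROR connection refused",
        "2024-05-01 10:00:04 ERROR timeout waiting for DB",
    ]
    for message, n in summarize_errors(sample).most_common():
        print(f"{n}x {message}")
```

In practice the same grouping is often done directly in shell (`grep ERROR app.log | sort | uniq -c`); the Python version is just easier to extend with thresholds or ticket creation.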
As part of our team, you will take on the following responsibilities:
• You develop and maintain cloud-based data pipelines and data products specifically for the Finance domain, built on our Snowflake database.
• You integrate financial and transactional data from various sources into our central Data Platform, optimize ETL processes, and ensure high data quality.
• You design and build new DataFlows and DataSets for Finance use cases and create as well as manage the corresponding Power BI reports.
• You collaborate closely with Finance Product Owners, Data Scientists, and Business Units to deliver meaningful analytical data products.
• You take ownership of data governance for all Finance data products and actively drive their further development.
• You support the modernization of our Finance data architecture and contribute to the migration towards cloud- and big-data-based technologies.
What makes you stand out
• You hold a degree in Computer Science, Business Informatics, or a comparable qualification.
• You have strong expertise in SQL databases, including data modeling, table design, and data querying, ideally with experience in Snowflake.
• You bring experience in developing and integrating data products and are familiar with Azure Data Factory, Python, and modern cloud technologies (preferably Azure).
• You have hands-on experience creating meaningful Power BI reports; knowledge of CI/CD or container technologies (e.g., Docker, Kubernetes) is a plus.
• You work analytically, communicate effectively, collaborate well in teams, and demonstrate a strong hands-on mentality.
• You are fluent in English and have very good German skills; you also have a passion for financial databases and enjoy exploring new topics.
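The ETL and data-quality responsibilities above can be sketched in miniature. A hedged illustration only: the column names and quality rule are hypothetical, and a real pipeline would read from and write to Snowflake (e.g., orchestrated via Azure Data Factory) rather than plain Python dicts:

```python
# Illustrative sketch only: the shape of an ETL step with a data-quality
# gate. Field names ("account", "currency", "amount") and the rules are
# hypothetical stand-ins for the Finance domain described above.

def transform(records):
    """Normalize raw finance records: uppercase currency, amount as float."""
    return [
        {"account": r["account"],
         "currency": r["currency"].upper(),
         "amount": float(r["amount"])}
        for r in records
    ]


def quality_check(records):
    """Reject the batch if any record lacks an account or has a negative amount."""
    problems = [r for r in records if not r["account"] or r["amount"] < 0]
    if problems:
        raise ValueError(f"{len(problems)} record(s) failed the quality gate")
    return records


def run_pipeline(raw):
    # Transform first, then gate the batch before it reaches downstream reports.
    return quality_check(transform(raw))
```

Placing the quality gate between transformation and load is what keeps bad records out of the Power BI layer downstream.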
With the knowledge from your studies and your passion for the subject, you can gain experience with the following tasks:
• Research into the different cluster structures
• Evaluation of the most suitable orchestration software with regard to scalability and portability
• Design and development of the cluster, including networking and automation options
• Documentation of the results
• Exchange with colleagues from different departments
WHAT YOU SHOULD BRING
• Enrollment in Computer Science (Informatik), Electrical and Information Engineering, Aerospace and Automotive Informatics, or a comparable technical degree program
• Basic knowledge of virtualization (virtual machines/containers)
• Experience with cloud infrastructure
• Strong technical, analytical, and conceptual skills
• An independent and results-oriented way of thinking and working
WHAT WE WILL OFFER YOU
• The fee is EUR 6,125.00, paid in monthly installments depending on the contract duration
• An additional bonus of EUR 250.00 for a very good thesis grade (better than 1.5 on the German scale)
• A challenging topic, defined together
• Flexible time management
• Subject-matter expertise
• An employer-subsidized company restaurant with a varied menu
• An international environment
• A free gym
• The MBDA student network
Would you like to learn more about us as a company and employer, our various fields of activity, or our values?
Tech stack: Elasticsearch, AWS, Python, Google BigQuery, Google Cloud Platform, NumPy, Pandas, GitLab
What you will do
• Design and develop innovative algorithms to power a personalized shopping experience, leveraging cutting-edge machine learning techniques
• Deploy your solutions into production, taking full ownership and ensuring high performance and scalability
• Combine your data science expertise with a pragmatic, agile approach to find innovative solutions and drive measurable results within a fast-paced environment
• Challenge the status quo by identifying areas for improvement in existing retrieval and reranking systems, particularly those relying heavily on business logic, and propose data-driven solutions
• Thrive in a dynamic, fast-paced environment with a flat hierarchy, where your ideas and contributions can make a real difference
Who you are
• Proficiency in Python or experience with at least one scientific computing language (e.g., MATLAB, R, Julia, C++)
• Strong SQL skills with experience in analytical or transactional database environments
• Theoretical understanding of machine learning principles, coupled with a hands-on approach to building and iterating on models
• Proven experience in building and deploying machine learning solutions that deliver tangible business value
• Strong understanding of data structures, algorithms, and tools for efficiently handling large datasets (e.g., pandas, numpy, dask, arrow, polars, …)
• Experience designing, building, and managing data pipelines
• Familiarity with cloud-based model training and serving platforms (e.g., GCP Vertex AI, Amazon SageMaker)
• Solid understanding of statistical methods for model evaluation
• Big Data: experience analyzing large datasets using statistical and machine learning techniques
• DevOps: familiarity with CI/CD tools (e.g., GitLab CI/CD, HashiCorp Terraform) is a plus
• Generative AI: experience with generative AI and agentic frameworks (e.g., LangChain, ADK, CrewAI, Pydantic AI, …) is a plus
• Understanding of recommendation, retrieval, and reranking systems in e-commerce and retail is a plus
• Excellent written and verbal communication skills in English
• Ability to effectively communicate complex machine learning concepts to both technical and non-technical stakeholders
• Proven ability to collaborate effectively within a team to establish standards and best practices for deploying machine learning models
• A proactive approach to knowledge sharing and to fostering a fast-moving development environment
Nice to have
• Experience with BigQuery
• Knowledge of time series and (graph) neural network models
• Familiarity with statistical testing and Gaussian processes
• Strong knowledge of computer vision libraries (e.g., OpenCV, TensorFlow, PyTorch)
• Experience maintaining machine learning pipelines through MLOps frameworks (e.g. …)
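The retrieval and reranking systems mentioned above follow a common two-stage pattern: a cheap recall stage over the whole catalog, then a more expensive reorder of the shortlist. A toy sketch only; the vectors, the popularity signal, and its 0.2 weight are made-up stand-ins (production systems use learned embeddings and ranking models):

```python
# Illustrative sketch only: a two-stage retrieve-then-rerank flow.
# "vec" and "popularity" are hypothetical toy features, not a real schema.
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query_vec, catalog, k=10):
    """Stage 1: recall the top-k items by embedding similarity."""
    scored = [(cosine(query_vec, item["vec"]), item) for item in catalog]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [item for _, item in scored[:k]]


def rerank(candidates, query_vec, popularity_weight=0.2):
    """Stage 2: reorder candidates, blending similarity with a business signal."""
    def score(item):
        return ((1 - popularity_weight) * cosine(query_vec, item["vec"])
                + popularity_weight * item["popularity"])
    return sorted(candidates, key=score, reverse=True)
```

The posting's point about systems "relying heavily on business logic" maps to `popularity_weight`: a data-driven improvement would replace that hand-tuned blend with a learned ranking model.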
What makes you stand out
• Proven experience in RPA development (minimum 3 years) using tools like UiPath, Automation Anywhere, or Blue Prism
• Strong programming skills in languages such as C#, .NET, Java, or Python
• Knowledge of software development methodologies, Agile frameworks, and version control
• Experience with process mapping and business process analysis
• Strong problem-solving and debugging skills
• Excellent communication skills, with the ability to collaborate effectively with cross-functional teams
• Bachelor's degree in Computer Science, Information Technology, or a related field
What we offer
• Attractive benefits package
• Regina Maria medical subscription for you and your family, in accordance with internal policy
• Meal vouchers
• Special discounts for gyms
• Additional days off
• DKV card with a fuel discount valid in 25 countries
• Free in-office perks such as massages, fruit, and coffee
• Flexible working hours with home office and flextime
• Continuous training and much more...
Responsibilities
• Lead end-to-end implementation and migration projects for enterprise clients
• Define and document target architectures, integrating SCAYLE with client systems (ERP, PIM, CRM)
• Provide guidance on SCAYLE platform capabilities, APIs, and composable commerce
• Translate business requirements into technical specifications and actionable work packages
• Collaborate with external partners to ensure delivery quality and standards compliance
• Review and validate integrations: REST APIs, webhooks, and asynchronous data flows
• Support cross-functional teams in resolving technical blockers
Requirements
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
• 5+ years in enterprise e-commerce architecture, digital consulting, or technical lead roles
• Proven experience leading large-scale SaaS or MACH replatforming projects
• Deep knowledge of e-commerce architectures: storefronts, OMS, PIM, CMS, ERP integrations
• Expert in MACH principles (Microservices, API-first, Cloud-native, Headless)
• Hands-on experience with REST APIs, webhooks, and event-driven architectures
• Strong problem-solving skills, a consultative mindset, and clear communication
YOU ARE THE CORE OF OUR COMPANY
We take responsibility for creating an inclusive and exceptional environment where all genders, nationalities, and ethnicities feel welcomed and accepted exactly as they are.
Your Profile - Qualifications
• Bachelor's/Master's/PhD in Computer Science, Applied Mathematics, Engineering, or a related field.
• Strong background in machine learning and deep learning.
• Proven experience with 3D graphics, computational geometry (meshes, point clouds, surface reconstruction), or computer vision.
• Proficiency in Python, C++, and ML frameworks (TensorFlow, PyTorch).
• Experience with medical imaging data (CT, CBCT, intraoral scans) is a plus.
• Full professional proficiency in English is required.
Preferred Skills
• Familiarity with GANs, diffusion models, or neural rendering for material reconstruction.
• Familiarity with biomedical applications.
• Strong problem-solving skills and the ability to work in interdisciplinary teams.
• Excellent communication and documentation abilities.
• Experience with large language models (a plus).
What We Offer
• Opportunity to work on cutting-edge applications in digital dentistry and orthodontics.
• Collaborative environment with experts in ML, graphics, and healthcare.
• Competitive salary and benefits package.
• Career growth in a rapidly evolving field.
What makes you stand out
• You hold a degree in (business) mathematics, computer science, economics, or a comparable field.
• You bring relevant professional experience in the data domain, for example as a BI Engineer or Data Engineer.
• You have strong knowledge of SQL and Python, as well as proven experience in developing cloud-based data pipelines using Snowflake, Azure Data Factory, DBT, and Azure Data Lake.
• You stand out through your analytical skills and project management capabilities, as well as a strong goal-oriented and problem-solving mindset.
• You communicate confidently in German and English.
Your Mission
As part of our AI platform team, you will:
• Work closely with AI Solution Architects, AI Engineers, and cross-functional technical teams to design, develop, and implement AI solutions across our products and services.
• Build and maintain a scalable, secure, and efficient infrastructure for our AI platform while continuously expanding your expertise in cloud services.
• Translate business requirements into functional AI solutions, guided by experienced colleagues in an agile development environment.
• Develop and maintain integrations between our AI platform (e.g., Conversational AI) and internal systems to support diverse business needs.
• Create demos and proof-of-concepts that showcase AI capabilities and promote AI adoption within the company.
• Act as a technical expert for our agentic AI platform, supporting other teams in leveraging AI and building their own solutions.
What makes you stand out
• Degree in Computer Science or a related field.
• 4+ years of experience in agile software development with a solid understanding of the full engineering lifecycle.
• Hands-on experience with Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), MCP servers, and agentic AI systems.
• Experience designing, building, and deploying advanced AI solutions.
• Experience with AI cloud platforms (e.g., Azure AI Foundry) and/or conversational AI platforms (e.g., Cognigy).
• DevOps experience, ideally within Microsoft Azure (ADO).
• Programming experience in at least one high-level language such as Python, Java, or TypeScript.
• Experience developing microservices using frameworks like Flask, Django, or Spring (Java).
What makes you stand out
• You hold a Master's degree in Economics, Mathematics, Statistics, Computer Science, or another related quantitative field.
• You bring at least 3 years of professional Data Science experience and have a proven track record of bringing models from the lab into real-world production use cases.
• You are proficient in Python and SQL and have hands-on experience building solutions on Azure and working with Snowflake.
• You have a deep understanding of statistical methods and apply them effectively in an agile, fast-paced environment.
• You think in products, not experiments: model versioning, testing, and performance monitoring are second nature to you.
• You communicate complex topics clearly and with impact, presenting confidently to different audiences; you are fluent in English, and German skills are a plus.
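The "performance monitoring" expected of a production-minded data scientist can be as simple as a health gate comparing live metrics to the offline baseline. A hedged sketch; the 5-point accuracy threshold is an arbitrary example, and a real setup would log to an experiment tracker or alerting system:

```python
# Illustrative sketch only: a minimal model-health gate of the kind implied
# by "model versioning, testing, and performance monitoring".
# The metric (accuracy) and max_drop threshold are hypothetical choices.

def check_model_health(baseline_accuracy, live_accuracy, max_drop=0.05):
    """Flag a deployed model whose live accuracy fell too far below baseline."""
    drop = baseline_accuracy - live_accuracy
    return {
        "drop": round(drop, 4),
        "healthy": drop <= max_drop,  # unhealthy models should trigger a retrain/rollback
    }
```

In a versioned setup, an unhealthy result would typically trigger a rollback to the previous model version rather than a manual investigation first.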
What makes you stand out
• You have several years of experience leading teams in a payments, IT, or platform operations environment.
• You bring solid expertise in payment processing, ideally in the areas of card acceptance, authorization systems, or payment schemes.
• You hold a completed degree in Computer Science, Business Informatics, Business Administration, or a comparable qualification.
• You have experience operating highly available 24/7 systems as well as in incident and provider management.
• You have a strong technical understanding of complex system architectures, integrations, and data flows.
• You have proven experience managing external service providers and leading complex, cross-functional projects.
• You are able to develop and implement technical concepts, feasibility analyses, and innovative solution approaches.
• You work in a structured and analytical manner and stand out through strong communication and stakeholder management skills.
• Very good German and English language skills, both written and spoken, complete your profile.
Your Profile
• Successfully completed university degree (Master's or comparable) in Computer Science, Computer Engineering Science (CES), Software Engineering, or a comparable field
• Programming experience in C++, C#, Python, JavaScript, or comparable languages
• Basic knowledge of edge/cloud computing and container virtualization
Rethink Retail: Stay ahead of emerging trends, and analyze and design next-gen business processes across omnichannel environments. What You Bring Education: Degree in Computer Science, Information Technology, Management Information Systems, or a relevant discipline. Experience: Strong experience in the Central/Latin America retail market and POS process knowledge.
YOUR TASKS
• Lead and manage an international SCADA deployment team of eight specialists
• Oversee SCADA system coordination and operational delivery
• Take ownership of installation and commissioning of Nordex SCADA systems
• Coordinate second-level support for SCADA field issues
• Ensure structured handover of SCADA topics to service teams
• Collaborate with development teams on product improvements
• Provide structured reporting to senior management
YOUR PROFILE
• Bachelor's degree in Computer Science or similar
• 2–3+ years of experience in system architecture, backend, or software development
• Initial leadership experience in a technical environment
• Experience with microservices, Docker, and cloud deployment
• Experience with InfluxDB and MongoDB
• Linux operations background and an agile mindset
• Fluent English and intercultural competence
YOUR BENEFITS
Nordex offers a range of attractive benefits; here's a selection of what you can look forward to.
What makes you stand out
• You hold a degree in Business Informatics, Data Science, Computer Science, Industrial Engineering, Business Administration, or a comparable field.
• You have several years of experience in building, operating, or further developing data-driven products, ideally in a sales, marketing, or customer service context.
• You have solid knowledge in designing and developing cloud-based data products such as data warehouses, semantic layers, analytical models, and reporting and analytics solutions.
• You possess strong knowledge of data architectures, data engineering, data science, and data governance.
• You bring experience in leading interdisciplinary teams both functionally and disciplinarily, ideally including Data Engineers, Data Scientists, Product Owners, and Data Governance roles.
• You demonstrate strong analytical and conceptual thinking skills and the ability to prepare complex topics in a structured and understandable way.
• You communicate effectively and are able to bridge different target groups (business & tech).
• You work with strong execution and results orientation while ensuring high data quality.
• You have excellent German and English skills, both written and spoken.
WHAT YOU WILL DO
• The role involves architecting, implementation planning, and execution, with some measure of hands-on work
• Set up the enterprise monitoring framework and governance
• Research emerging technologies in support of increased cost effectiveness and flexibility
• Product selection, including PoC and RFI/RFP support
• Provide strategic direction for standardizing the monitoring landscape and for the efficient use of resources in proactive application and infrastructure monitoring
• Establish and leverage key strategic vendor relationships
• Develop and maintain build standards for related services
• Design and implement automation initiatives to optimize the current state
• Form and lead global virtual teams, creating functional touch points with both architecture and operations teams to enable the onboarding of new products/technologies that improve the infrastructure supporting the global strategy
• Deliver complex infrastructure project implementations
• Infrastructure problem management: provide 3rd-level problem management support for relevant Tools and Monitoring services
PRODUCTS YOU NEED TO BE FAMILIAR WITH
• Microfocus OML
• Splunk
• Dynatrace
• SCOM
• Topaz
• IQSonar
• Microfocus Sitescope
• TeamQuest
• IBM Tivoli
WHAT YOU SHOULD HAVE
• Degree in Computer Science, Information Systems, or equivalent experience
• Minimum 10 years of experience in IT or a related field, with at least 5 of those years focused on enterprise Tools and Monitoring
• End-to-end product lifecycle management, from strategic planning to tactical activities
• Good knowledge of operating systems, system management, hardware, big data, automation tools, and virtualization technologies, and of how these technologies interoperate
• Demonstrated skill in critical thinking and problem-solving methods
• Demonstrable experience designing highly available, highly scalable production systems
• Exposure to multi-cloud infrastructure
• Vision for information delivery and management, and for driving execution of the roadmap, including enterprise data architecture, big data, analytics, and data management