Senior Data Scientist
Data Science & Advanced Analytics
Frankfurt, Germany

The Senior Data Scientist is responsible for designing, deploying, and continuously optimizing scalable ML solutions that translate business requirements into measurable impact across EMEA markets. The position combines technical execution with business alignment, ensuring solutions remain adaptable to evolving commercial needs while maintaining architectural consistency and standardization.
What You’ll Do

Drive Measurable Business Impact
- Independently lead AI and analytics initiatives that generate tangible business value
- Actively contribute to and influence the company’s strategic direction

Apply Advanced Analytics & AI
- Develop and apply advanced statistical methods in an agile environment
- Work on business-critical questions using regression models, time series analysis, and machine learning & AI algorithms
- Collaborate cross-functionally or drive initiatives independently

Turn Data into Action
- Design and execute analyses on large datasets
- Translate findings into clear, actionable recommendations
- Work across the full spectrum, from Excel-based analysis to deep learning models

Build Scalable AI Solutions
- Develop and own customized Data Science and AI solutions
- Work with SQL on Azure and Snowflake platforms; Python is nice to have
- Lead projects end-to-end, from proof of concept (PoC) to fully operational production models
- Create audience-tailored presentations
- Translate complex analytical insights into clear business language
- Deliver compelling data storytelling to support decision-making at all levels

What makes you stand out
- Minimum 3 years of experience in Data Science, AI, Advanced Analytics, or similar roles
- Proven ability to generate measurable business value through data-driven solutions
- Strong expertise in statistical modeling and machine learning techniques
- Hands-on experience with Python, SQL, Azure, and Snowflake
- Experience building scalable models from PoC to production
- Strong communication and stakeholder management skills
- Ability to explain complex topics to both technical and non-technical audiences
- Bachelor’s or Master’s degree in Finance, Statistics, Computer Science, Mathematics, or a related quantitative field

We look forward to your application and to applicants who enrich our diverse culture!
My Employer
As a successful, growing mid-sized provider of relevant, innovative data- and AI-driven solutions that optimize decision-making for well-known companies, this employer offers an open, agile, progress-oriented culture in which teamwork, personal responsibility, and continuous learning are highly valued. Room to shape demanding projects, long-term prospects, and flexibility are included.

- Design, implement, and further develop generative AI and NLP systems that optimally meet customer requirements for performance, latency, cost, and extensibility
- Regular alignment with stakeholders on requirements
- Close collaboration with developers and technical leads on implementation and on use cases such as retrieval-based chatbots, agent systems, or fine-tuning of language models
- Planning and implementation of robust machine learning pipelines following best practices, on Azure, AWS, or GCP
- Input on complex technical challenges and presentation of solution approaches
- Active tracking of new developments in NLP and AI, so customers always receive modern, high-quality solutions

- Successfully completed degree in (business) computer science or a comparable qualification
- Relevant professional experience in data science, ideally with several of the typical NLP/LLM tools such as OpenAI APIs, Bedrock, Azure AI Foundry, LangChain, LangGraph, Instructor, Hugging Face, tokenizers, vector databases, performant inference, model deployment, MCP/A2A, and dataset creation
- Very good knowledge of machine learning and deep learning, especially transformer models, LLMs, and generative AI
- Confident use of production-ready frameworks such as PyTorch and of agent frameworks such as LangGraph, SmolAgents, OpenAI Agent SDK, CrewAI, or PydanticAI
- Deep understanding of model optimization with PEFT techniques such as QLoRA, instruction fine-tuning, post-training, inference optimization, and embeddings
- Confident with workflows such as conversational AI, RAG, information extraction, tool calling, and LLM evaluation
- Knowledge of agentic RAG, GraphRAG, multi-agent systems, text-to-SQL, and code retrieval
- Very good understanding of deployments and MLOps on Azure, GCP, or AWS
- High standards for software quality and the ability to write clean, performant, scalable code and put AI systems into production
- The ability to translate complex requirements into technical solutions and convey them confidently even to non-experts, plus good German and English skills

- Independent work and opportunities to help shape the company thanks to short decision-making paths
- Growth through a focus on innovation: the varied projects revolve around developing intelligent algorithms, data-based strategies, and tailor-made AI solutions
- A committed, dynamic, constructive team with strong cohesion and an open feedback culture
- Well-connected, modern offices and high-quality technical equipment
- A flexibly plannable remote share of up to 40%, even temporarily from elsewhere in the EU
- Subsidies for the Deutschland-Ticket, sports and wellness offers, and childcare

Salary Information
Depending on experience, up to €100,000 p.a.
- Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
- Support model development by assisting with training, validation, and optimization of machine learning workflows
- Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
- Solve technical challenges related to data access, pipeline performance, and software limitations
- Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
- Perform image analysis and prepare datasets required for scientific and ML use cases
- Manage and improve ETL processes to ensure data quality, structure, and availability
- Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility

- Academic background in computer science, data science, engineering, or a related quantitative field
- Strong proficiency in Python with expertise in scientific and analytical libraries
- Skilled in SQL and working with relational databases
- Understanding of ETL concepts and practical experience working with data pipelines
- Solid foundation in machine learning principles and the model lifecycle
- Ability to perform image analysis for scientific or research applications
- Strong communication and interpersonal skills with the ability to collaborate in a technical team
- Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices

- Opportunity to contribute directly to active R&D projects with immediate real-world impact
- Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
- Close collaboration with a small, highly skilled technical team

Your Contact
Reference number: 863771/1
Phone: +41 44 225 50 00
E-mail: positionen@hays.ch
Employment type: Freelance, project-based
Tech stack: Elasticsearch, AWS, Python, Google BigQuery, Google Cloud Platform, NumPy, pandas, GitLab

What you will do
- Design and develop innovative algorithms to power a personalized shopping experience, leveraging cutting-edge machine learning techniques
- Deploy your solutions into production, taking full ownership and ensuring high performance and scalability
- Combine your data science expertise with a pragmatic, agile approach to find innovative solutions and drive measurable results within a fast-paced environment
- Challenge the status quo by identifying areas for improvement in existing retrieval and reranking systems, particularly those relying heavily on business logic, and propose data-driven solutions
- Thrive in a dynamic, fast-paced environment with a flat hierarchy, where your ideas and contributions can make a real difference

Who you are
- Proficiency in Python or experience with at least one scientific computing language (e.g., MATLAB, R, Julia, C++)
- Strong SQL skills with experience in analytical or transactional database environments
- Theoretical understanding of machine learning principles, coupled with a hands-on approach to building and iterating on models
- Proven experience in building and deploying machine learning solutions that deliver tangible business value
- Strong understanding of data structures, algorithms, and tools for efficiently handling large datasets (e.g., pandas, numpy, dask, arrow, polars, …)
- Experience designing, building, and managing data pipelines
- Familiarity with cloud-based model training and serving platforms (e.g., GCP Vertex AI, Amazon SageMaker)
- Solid understanding of statistical methods for model evaluation
- Big Data: experience analyzing large datasets using statistical and machine learning techniques
- DevOps: familiarity with CI/CD tools (e.g., GitLab CI/CD, HashiCorp Terraform) is a plus
- Generative AI: experience with generative AI and agentic frameworks (e.g., LangChain, ADK, CrewAI, Pydantic AI, …) is a plus
- Understanding of recommendation, retrieval, and reranking systems in e-commerce and retail is a plus
- Excellent written and verbal communication skills in English
- Ability to effectively communicate complex machine learning concepts to both technical and non-technical stakeholders
- Proven ability to collaborate effectively within a team to establish standards and best practices for deploying machine learning models
- A proactive approach to knowledge sharing and fostering a fast-moving development environment

Nice to have
- Experience with BigQuery
- Knowledge of time series and (graph) neural network models
- Familiarity with statistical testing and Gaussian processes
- Strong knowledge of computer vision libraries (e.g.
Collaborate with data scientists and engineers to break down abstract business challenges into a defined product vision, MVPs, and iterative delivery plans. Act as a bridge between technical and business teams, simplifying complex technical concepts and data insights so they are understandable for stakeholders, and translate business needs into technical requirements.
What you will do
- Gather, analyze, and interpret data to fulfill reporting requests from business stakeholders
- Translate complex data into clear, actionable insights that support business strategy
- Support ad hoc analysis and contribute to data-driven decision-making
- Automate routine reporting processes and maintain existing reports to ensure reliability and improve data workflows
- Maintain, design, and implement advanced dashboards using tools like Google Looker Studio, enabling self-service analytics across the organization
- Collaborate with data engineers, data scientists, and stakeholders across the organization to ensure data quality and consistency while delivering data-driven insights that support supply-related business decisions
- Communicate findings effectively to technical and non-technical audiences
- Foster a data-driven culture within the organization, promoting the use of analytics in decision-making processes

Who you are
- At least 1–2 years of experience as a Data Analyst
- Proficiency in SQL for complex data analysis, reporting, and querying large datasets
- Experience with Google BigQuery
- Experience with data visualization tools (e.g., Looker Studio, Excel, or similar) to create compelling dashboards and reports
- Excellent communication skills in English
- Ability to translate fuzzy business requirements from diverse stakeholders into analytical requirements
The Oberalp Group is a management-driven family business - a house of brands that creates high-quality technical mountaineering products. We have six own brands: Salewa, Dynafit, Pomoca, Wild Country, Evolv, and LaMunt. As an exclusive partner of other brands in the sports sector, we offer our entire know-how in communication, sales, and image building.
What you can expect
- You take on the functional and disciplinary responsibility for all FTEs in the Sales Data Hub within the federated data setup, including Data Engineers, Data Scientists, Data Governance roles, and Product Owners Data.
- You define, design, develop, and operate cloud-based data products for the Sales, Marketing, and Customer Service business units.
- You are responsible for the methodological integration of data assets and data products.
- You manage the Sales Data Hub operationally and further develop it as a specialized unit for data-driven solutions in Sales, Marketing, and Customer Service.
- You are responsible for the further development and ongoing maintenance of all sales-related models based on feedback and requirements from the sales organization.
- You lead projects related to planning, expanding, and organizing new and existing products in collaboration with the relevant business units and external partners.
- You assume technical responsibility for data products developed by or for the Sales, Marketing, and Customer Service areas within the Data Intelligence & Analytics team.
- You drive the continuous expansion, professionalization, and organizational development of the Sales Data Hub within the existing governance and organizational framework.
To meet the ambitious growth targets of the specialized nutrition business in EMEAI, ADM WILD is seeking an experienced Technical Business Manager (gn) who will take a crucial role in leading and driving technical business activities.

Position Description
- With your passion for (sports) nutrition and your commercial mindset, you develop, execute, and implement all specialized nutrition projects in the EMEAI region
- You lead and support the development of powder blends in the EMEAI region, working closely with other units and other cross-category teams
- As technical customer contact and proactive technical business driver, you provide guidance, support, and technical management of all accounts and projects for this category
- You build up, lead, coach, and train a team of scientists and technicians.
- Work data-driven, based on reports and analyses, to derive actions oriented toward customer needs
- Manage and communicate with all relevant stakeholders
- Understand stakeholder demands, needs, and issues; translate them into actionable, feasible product requirements and build scalable tech solutions
- Triage operational issues, identify technical dependencies, and evaluate business impact
- Lead cross-team projects and collaborate closely with other tech teams to execute end-to-end solutions

Who you are
- Master’s degree or equivalent in a relevant field
- 7–10 years of experience in product management for tech products within software development, focused on backend systems
- Proven expertise in agile methodologies and collaboration with developers, data scientists, and QA managers
- Strong understanding of technology, business processes, and functional dependencies
- Hands-on experience with e-commerce systems, ideally in retail
- Skilled in IT project management and leading cross-functional initiatives
- At least 2 years of team leadership experience
- Analytical, detail-oriented, and customer-focused with a proactive mindset
- Exceptional communication skills in English, adaptable to diverse perspectives
- Comfortable in international work environments, with a focus on efficiency and solutions

Benefits
Hybrid working, fresh fruit every day, sports courses, free access to code.talks, exclusive employee discounts, free drinks, language courses, free Laracast account, company parties, help in the relocation process, mobility subsidy, state-of-the-art technology, central location, flexible working hours, company pension, professional training, dog-friendly office, remote work, AY Academy, feedback culture, job bikes.

YOU ARE THE CORE OF ABOUT YOU.
You ensure that customers see the most relevant products in the right order – balancing customer needs, business goals, and technical feasibility. We work with over 1 million real-time API calls per hour and a huge amount of data to deliver highly performant and intelligent sorting mechanisms to millions of users.