What you will do
- Gather, analyze, and interpret data to fulfill reporting requests from business stakeholders
- Translate complex data into clear, actionable insights that support business strategy
- Support ad hoc analysis and contribute to data-driven decision-making
- Automate routine reporting processes and maintain existing reports to ensure reliability and improve data workflows
- Maintain, design, and implement advanced dashboards using tools like Google Looker Studio, enabling self-service analytics across the organization
- Collaborate with data engineers, data scientists, and stakeholders across the organization to ensure data quality and consistency while delivering data-driven insights that support supply-related business decisions
- Communicate findings effectively to technical and non-technical audiences
- Foster a data-driven culture within the organization, promoting the use of analytics in decision-making processes

Who you are
- At least 1-2 years of experience as a Data Analyst
- Proficiency in SQL for complex data analysis, reporting, and querying large datasets
- Experience with Google BigQuery
- Experience with data visualization tools (e.g., Looker Studio, Excel, or similar) to create compelling dashboards and reports
- Excellent communication skills in English
- Ability to translate fuzzy business requirements from diverse stakeholders into analytical requirements
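The kind of stakeholder reporting query this role describes can be sketched with a small, self-contained example. The `orders` table, its columns, and all figures are hypothetical, and sqlite3 stands in for Google BigQuery, whose SQL dialect differs in places.

```python
import sqlite3

# Hypothetical orders table standing in for a BigQuery warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'EMEA', 120.0), (2, 'EMEA', 80.0), (3, 'APAC', 300.0);
""")

# Revenue per region with share of total -- a typical dashboard metric.
rows = conn.execute("""
SELECT region,
       SUM(amount) AS revenue,
       ROUND(SUM(amount) * 100.0 / (SELECT SUM(amount) FROM orders), 1) AS pct_of_total
FROM orders
GROUP BY region
ORDER BY revenue DESC
""").fetchall()

print(rows)  # [('APAC', 300.0, 60.0), ('EMEA', 200.0, 40.0)]
```

A query like this would typically feed a Looker Studio dashboard directly, rather than being printed.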
The Oberalp Group is a management-driven family business and a house of brands that creates high-quality technical mountaineering products. We have six own brands: Salewa, Dynafit, Pomoca, Wild Country, Evolv, and LaMunt. As an exclusive partner of other brands in the sports sector, we offer our entire know-how in communication, sales, and image building.
We look forward to hearing from you.

- Manage and refine business and technical requirements in collaboration with stakeholders
- Coordinate data integration activities with various source systems
- Design and model data structures within a Data Warehouse environment, with a strong focus on Data Vault methodology
- Develop and optimize data pipelines using SQL and Python
- Work with tools like Databricks and dbt to build scalable data transformation workflows
- Ensure data quality, consistency, and compliance, especially within banking-related use cases

- Experience in requirements management
- Experience in coordination with source systems
- Experience with data modeling in a Data Warehouse environment, with a focus on Data Vault
- Good German and English language skills
- Databricks experience is nice to have
- Experience with dbt (data build tool) is an advantage
- Experience with SQL (as a query language) and Python is an advantage
- Banking experience is an advantage

- Renowned client
- Remote work

Your contact: Florian Pracher, reference number 863466/1. Email: florian.pracher@hays.at. Employment type: freelance, for a project.
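The Data Vault focus mentioned above can be illustrated with a minimal hub-loading sketch: business keys are hashed into a hub key and inserted only if new. The `hub_customer` table, its columns, and the MD5-based key are illustrative assumptions (real Data Vault implementations vary in hash choice and metadata conventions), and sqlite3 stands in for the warehouse.

```python
import hashlib
import sqlite3

def hub_key(business_key: str) -> str:
    # Deterministic surrogate key from the normalized business key (assumption:
    # MD5 over an upper-cased, trimmed key; conventions differ per project).
    return hashlib.md5(business_key.upper().strip().encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE hub_customer (
    hub_customer_key TEXT PRIMARY KEY,
    customer_bk TEXT,
    load_dts TEXT,
    record_source TEXT)""")

def load_hub(conn, business_keys, load_dts, source):
    # Idempotent load: keys already present in the hub are ignored.
    for bk in business_keys:
        conn.execute(
            "INSERT OR IGNORE INTO hub_customer VALUES (?, ?, ?, ?)",
            (hub_key(bk), bk, load_dts, source),
        )

load_hub(conn, ["C-100", "C-200"], "2024-01-01", "crm")
load_hub(conn, ["C-200", "C-300"], "2024-01-02", "crm")  # C-200 already loaded

count = conn.execute("SELECT COUNT(*) FROM hub_customer").fetchone()[0]
print(count)  # 3 distinct business keys
```

In a dbt project the same pattern would be expressed as an incremental model rather than imperative inserts.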
- Design, build, and optimize batch data pipelines for internal tool use cases
- Develop efficient Spark SQL transformations for large-scale datasets
- Use Python for data processing, orchestration, and automation
- Create and maintain data models (facts, dimensions, aggregates) with clear grain and metric definitions
- Ensure data quality and correctness, including handling late data, duplicates, and adjustments
- Implement validation, data quality checks, and reconciliation logic
- Work with business stakeholders to gather requirements, define metrics, and translate needs into pipelines
- Collaborate with infrastructure teams on standards, performance tuning, and best practices

- Bachelor's or Master's degree in a technical field or an equivalent qualification
- Experience in data engineering or a related field
- Strong proficiency in Spark SQL for large-scale data transformations
- Solid Python skills for data processing and pipeline development
- Strong understanding of data modeling (fact tables, dimensions, grain, SCDs)
- Hands-on experience building and maintaining batch pipelines in production
- High attention to detail with a strong focus on data quality and metric integrity
- Ability to communicate clearly with non-technical stakeholders and translate business needs into data solutions

- Remuneration under the most attractive collective agreement in the industry
- Annual leave entitlement of 30 days
- Generous working time account with the possibility of paid overtime
- Subsidized direct insurance (as a company pension scheme)

Your contact: Kristina Meng, reference number 863942/1. Email: kristina.meng@hays.de. Employment type: employment with Hays Professional Solutions GmbH.
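The SCD handling listed in the profile can be sketched as a Type 2 slowly changing dimension update: when an attribute changes, the current row is end-dated and a new version inserted. Table and column names are hypothetical, and sqlite3 stands in for the production warehouse, where this would typically be a Spark SQL MERGE.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_product (
    product_bk TEXT, category TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")

def scd2_upsert(conn, bk, category, as_of):
    # Type 2 logic: unchanged rows are kept, changed rows get a new version.
    cur = conn.execute(
        "SELECT category FROM dim_product WHERE product_bk=? AND is_current=1",
        (bk,)).fetchone()
    if cur and cur[0] == category:
        return  # no attribute change: nothing to do
    if cur:  # close the previous version
        conn.execute(
            "UPDATE dim_product SET valid_to=?, is_current=0 "
            "WHERE product_bk=? AND is_current=1", (as_of, bk))
    conn.execute(
        "INSERT INTO dim_product VALUES (?, ?, ?, '9999-12-31', 1)",
        (bk, category, as_of))

scd2_upsert(conn, "P1", "outdoor", "2024-01-01")
scd2_upsert(conn, "P1", "outdoor", "2024-02-01")   # unchanged, no new row
scd2_upsert(conn, "P1", "climbing", "2024-03-01")  # change: second version

versions = conn.execute(
    "SELECT COUNT(*) FROM dim_product WHERE product_bk='P1'").fetchone()[0]
print(versions)  # 2 versions of P1
```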
We support you on your journey: individual learning opportunities, worldwide job opportunities, and technical training from our academy. The safety and well-being of our employees are important to us, which is why we set high standards for your workplace safety.
- Automation of data subject information processes from intake to response, objection handling, and individual case deletion
- Integration of Broker Channel and Data Product and voice transcripts into the Data Product
- Data Product information: JSON test feedback / change requests
- Integration of customer interests (CSC) into the Data Product for data subject rights
- Design, adaptation, and implementation of the database architecture
- Independent migration of existing data and structures into the new Core Data Warehouse
- Logical and technical modeling of the Raw and Business Vault
- Creation of complex business transformations in SQL based on business requirements
- Analysis and implementation of existing transformation logic for Data Vault 2.0

- Experience with Data Vault is required
- dbt and SQL expertise is required
- Snowflake expertise is advantageous

- Start date: ASAP
- Language: German
- Remote option

Your contact: reference number 863903/1. Phone: +49 621 1788-4297. Email: positionen@hays.de. Employment type: freelance, for a project.
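The Data Vault 2.0 transformation logic mentioned above commonly hinges on hashdiff-based change detection for satellites: a new satellite row is written only when the hash of the descriptive attributes differs from the latest stored row. A minimal sketch, with illustrative table and column names (real implementations typically use stronger hashes and richer load metadata):

```python
import hashlib
import sqlite3

def hashdiff(attrs: dict) -> str:
    # Canonical, order-independent serialization of the descriptive attributes.
    canon = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.md5(canon.encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sat_customer (
    hub_key TEXT, load_dts TEXT, hashdiff TEXT, payload TEXT)""")

def load_sat(conn, hub_key, attrs, load_dts):
    hd = hashdiff(attrs)
    last = conn.execute(
        "SELECT hashdiff FROM sat_customer WHERE hub_key=? "
        "ORDER BY load_dts DESC LIMIT 1", (hub_key,)).fetchone()
    if last and last[0] == hd:
        return False  # unchanged: no new satellite row
    conn.execute("INSERT INTO sat_customer VALUES (?, ?, ?, ?)",
                 (hub_key, load_dts, hd, str(attrs)))
    return True

load_sat(conn, "H1", {"city": "Vienna"}, "2024-01-01")
load_sat(conn, "H1", {"city": "Vienna"}, "2024-01-02")  # skipped, same hashdiff
load_sat(conn, "H1", {"city": "Graz"}, "2024-01-03")    # new version

n = conn.execute("SELECT COUNT(*) FROM sat_customer").fetchone()[0]
print(n)  # 2 satellite rows
```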
- Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
- Support model development by assisting with training, validation, and optimization of machine learning workflows
- Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
- Solve technical challenges related to data access, pipeline performance, and software limitations
- Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
- Perform image analysis and prepare datasets required for scientific and ML use cases
- Manage and improve ETL processes to ensure data quality, structure, and availability
- Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility

- Academic background in computer science, data science, engineering, or a related quantitative field
- Strong proficiency in Python with expertise in scientific and analytical libraries
- Skilled in SQL and working with relational databases
- Understanding of ETL concepts and practical experience working with data pipelines
- Solid foundation in machine learning principles and the model lifecycle
- Ability to perform image analysis for scientific or research applications
- Strong communication and interpersonal skills with the ability to collaborate in a technical team
- Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices

- Opportunity to contribute directly to active R&D projects with immediate real-world impact
- Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
- Close collaboration with a small, highly skilled technical team

Your contact: reference number 863771/1. Phone: +41 44 225 50 00. Email: positionen@hays.ch. Employment type: freelance, for a project.
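The ETL data-quality responsibilities above can be illustrated with a small validation sketch that flags missing fields and duplicate keys before data enters downstream systems. The record layout and field names are assumptions made for the example.

```python
# Minimal data-quality check over incoming records, assuming rows arrive as
# dicts with a hypothetical "sample_id" key field and a "value" measurement.
def validate(rows, required=("sample_id", "value")):
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        key = row.get("sample_id")
        if key in seen:
            issues.append((i, f"duplicate sample_id {key}"))
        seen.add(key)
    return issues

rows = [
    {"sample_id": "S1", "value": 0.42},
    {"sample_id": "S1", "value": 0.43},   # duplicate key
    {"sample_id": "S2", "value": None},   # missing measurement
]
print(validate(rows))
# [(1, 'duplicate sample_id S1'), (2, 'missing value')]
```

A real pipeline would log these issues and quarantine the offending rows rather than print them.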
What you can expect
- You take on functional and disciplinary responsibility for all FTEs in the Sales Data Hub within the federated data setup, including Data Engineers, Data Scientists, Data Governance roles, and Product Owners Data.
- You define, design, develop, and operate cloud-based data products for the Sales, Marketing, and Customer Service business units.
- You are responsible for the methodological integration of data assets and data products.
- You manage the Sales Data Hub operationally and further develop it as a specialized unit for data-driven solutions in Sales, Marketing, and Customer Service.
- You are responsible for the further development and ongoing maintenance of all sales-related models based on feedback and requirements from the sales organization.
- You lead projects related to planning, expanding, and organizing new and existing products in collaboration with the relevant business units and external partners.
- You assume technical responsibility for data products developed by or for the Sales, Marketing, and Customer Service areas within the Data Intelligence & Analytics team.
- You drive the continuous expansion, professionalization, and organizational development of the Sales Data Hub within the existing governance and organizational framework.
RND RedaktionsNetzwerk Deutschland GmbH is looking for a Technical Data Analyst (d/m/w) (ID number: 13650300)
What You'll Do:

Collaborate in an Agile, International Team
- Work closely with colleagues from Romania, Germany, and Ukraine
- Design, estimate, develop, and implement software solutions aligned with business needs
- Actively communicate progress, risks, and technical decisions to stakeholders

Build Scalable Data Solutions
- Develop agnostic data products within a modern, cloud-native data ecosystem
- Support use cases across BI, Advanced Analytics, AI, and ML
- Translate business requirements into robust technical architectures
- Continuously enhance performance, quality, and cost-efficiency of solutions
- Proactively suggest improvements and best practices

What makes you stand out
- Degree in Computer Science, Economics, or a comparable qualification
- Minimum 3 years of experience as a BI Engineer or Data Engineer, focused on cloud-based architectures
- Strong expertise in Snowflake and dbt (Data Build Tool)
- Solid knowledge of SQL and data lakehouse architectures; Python is nice to have

Communication is Key
- Excellent communication skills in English (written and spoken) are mandatory
- Ability to clearly explain technical concepts to both technical and non-technical stakeholders
- Strong stakeholder management and collaboration skills
- Comfortable working in cross-border, multicultural teams

We are looking forward to your application and to applicants who enrich our diverse culture!
- You are responsible for the conceptual, logical, and structural integrity of our Core Data Model as well as the Gold Layer across Azure, Snowflake, and dbt.
- You ensure that fragmented data sources are transformed into consistent, reusable, and decision-relevant data products, actively preventing the platform from drifting into team-specific, incompatible models.
- You define and maintain central business objects, canonical dimensions, shared metrics, and facts, ensuring that the Core Data Model serves as a stable, business-oriented foundation across all domains.
- You develop modeling standards, naming conventions, layering concepts (Staging → Intermediate → Gold), reuse patterns, and dbt design guidelines, and you ensure their consistent implementation across all teams.
- You safeguard the semantic consistency of the entire data model, resolve domain conflicts, ensure that identical business terms are modeled only once, and review changes affecting core layers.
- You act as the technical design authority for model changes in Snowflake/dbt, balancing local requirements with long-term model coherence, and ensuring that all models remain performant, scalable, maintainable, and of high quality.
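The Staging → Intermediate → Gold layering described above can be sketched as stacked SQL views, where each layer has exactly one job. The tables, view names, and the business rule are hypothetical, and sqlite3 stands in for Snowflake/dbt, where each view would be a dbt model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_orders (id INTEGER, cust TEXT, amt TEXT, status TEXT);
INSERT INTO raw_orders VALUES
  (1, ' A ', '10.5', 'OK'), (2, 'B', '20.0', 'CANCELLED'), (3, 'A', '5.0', 'OK');

-- staging: type casting and trimming only, no business logic
CREATE VIEW stg_orders AS
SELECT id, TRIM(cust) AS customer, CAST(amt AS REAL) AS amount, status
FROM raw_orders;

-- intermediate: business filtering on clean types
CREATE VIEW int_orders_valid AS
SELECT * FROM stg_orders WHERE status = 'OK';

-- gold: the shared, decision-relevant metric, defined once
CREATE VIEW gold_revenue_by_customer AS
SELECT customer, SUM(amount) AS revenue
FROM int_orders_valid GROUP BY customer;
""")

rows = conn.execute(
    "SELECT * FROM gold_revenue_by_customer ORDER BY customer").fetchall()
print(rows)  # [('A', 15.5)] -- B's order is cancelled, so only A remains
```

The point of the layering is that downstream teams consume only the gold view, so the metric definition cannot drift across domains.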
What You'll Do

Drive Measurable Business Impact
- Independently lead AI and analytics initiatives that generate tangible business value
- Actively contribute to and influence the company's strategic direction

Apply Advanced Analytics & AI
- Develop and apply advanced statistical methods in an agile environment
- Work on business-critical questions using regression models, time series analysis, and machine learning & AI algorithms
- Collaborate cross-functionally or drive initiatives independently

Turn Data into Action
- Design and execute analyses on large datasets
- Translate findings into clear, actionable recommendations
- Work across the full spectrum, from Excel-based analysis to deep learning models

Build Scalable AI Solutions
- Develop and own customized Data Science and AI solutions
- Work with SQL on Azure and Snowflake platforms; Python is nice to have
- Lead projects end-to-end: from Proof of Concept (PoC) to fully operational production models
- Create audience-tailored presentations
- Translate complex analytical insights into clear business language
- Deliver compelling data storytelling to support decision-making at all levels

What makes you stand out
- Minimum 3 years of experience in Data Science, AI, Advanced Analytics, or similar roles
- Proven ability to generate measurable business value through data-driven solutions
- Strong expertise in statistical modeling and machine learning techniques
- Hands-on experience with Python, SQL, Azure, and Snowflake
- Experience building scalable models from PoC to production
- Strong communication and stakeholder management skills
- Ability to explain complex topics to both technical and non-technical audiences
- Bachelor's or Master's degree in Finance, Statistics, Computer Science, Mathematics, or a related quantitative field

We are looking forward to your application and to applicants who enrich our diverse culture!
Collaborate with data scientists and engineers to break down abstract business challenges into a defined product vision, MVPs, and iterative delivery plans. Act as a bridge between technical and business teams, simplifying complex technical concepts and data insights, making them understandable for stakeholders. You’ll also translate business needs into technical requirements.
What makes you stand out
- You hold a degree in Business Informatics, Data Science, Computer Science, Industrial Engineering, Business Administration, or a comparable field.
- You bring experience in Revenue Operations, Sales Operations, Sales Analytics, or a similar commercial analytics role.
- You have a strong understanding of sales processes, pipeline management, forecasting, and revenue metrics, and you can translate them into technical requirements for engineers.
- You are a great stakeholder manager who can talk to sales and engineering alike.
- You are highly proficient in Power BI and experienced in building dashboards that drive action.
- You are comfortable working with CRM data and sales systems (e.g.
Key Responsibilities
- Develop and maintain detailed project budgets, cost estimates, and financial forecasts for data center construction and infrastructure projects, tracking expenditures against approved budgets
- Prepare comprehensive cost reports, cash flow projections, and value engineering analyses to optimize project costs while maintaining quality standards and technical requirements
- Coordinate with project managers, contractors, and procurement teams to evaluate contract proposals, change orders, and variation requests while ensuring cost-effective project delivery
- Conduct risk assessments for cost implications, develop contingency strategies, and monitor market conditions affecting material and labor costs in data center construction
- Review and validate contractor invoices, progress payments, and final accounts while maintaining detailed cost documentation and audit trails for financial compliance
- Support procurement processes, tender evaluations, and contract negotiations to achieve optimal value while facilitating cost reconciliation, lessons learned, and knowledge transfer for future projects

Qualifications & Skills
- Bachelor's degree in Quantity Surveying, Construction Management, Engineering, or a related field, with several years of experience in cost management for data center or mission-critical facility projects
- Comprehensive knowledge of data center construction costs and industry pricing trends, and the ability to interpret technical specifications for accurate cost estimation and budget development
- Strong financial analysis expertise with proficiency in cost management software and spreadsheet applications, and a demonstrated ability to prepare detailed cost reports and forecasts
- Excellent analytical, communication, and stakeholder management skills with a proven ability to negotiate with contractors and suppliers while managing cost-related project risks
- Experience with value engineering, life-cycle costing, and cost optimization techniques in complex construction environments, with an understanding of procurement processes and contract administration

Jones Lang LaSalle SE, Human Resources. Your contact: Jan Bauermann, Talent Acquisition Partner EMEA, jan.bauermann@jll.com. Location: On-site, Frankfurt am Main, DEU. If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements.
- Design and implement a SQL-based landing zone for regulatory data
- Develop stored procedures for transformation, enrichment, and aggregation
- Build and operate high-volume batch processing chains for monthly/quarterly cycles
- Implement SSIS-based ingestion flows and job orchestration
- Ensure data quality, technical lineage, and full traceability across layers
- Define and document integration patterns and mapping logic between landing-zone datasets and Tagetik-based reporting templates
- Perform operational monitoring, troubleshooting, and performance optimization

- Strong expertise in Microsoft SQL Server and T-SQL
- Hands-on experience with stored-procedure-driven ETL and complex data models
- Solid SSIS skills for orchestration and control of processing chains
- Experience with batch processing, logging, restartability, and performance tuning
- Knowledge of data lineage, reconciliation, and regulatory processing needs
- Experience with reporting platforms such as Tagetik is a plus
- Familiarity with Oracle source systems is advantageous

- Renowned client
- Remote option

Your contact: Eliška Stejskalová, reference number 862801/1. Email: eliska.stejskalova@hays.at. Employment type: freelance, for a project.
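The restartability requirement above can be sketched with a watermark-driven batch loader: a control table records the last completed batch, so a rerun after a failure skips work that already finished. In the actual stack this logic would live in T-SQL stored procedures orchestrated by SSIS; the Python/sqlite3 version below is only an illustration with invented table names.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE landing (batch_id INTEGER, payload TEXT);
CREATE TABLE target (batch_id INTEGER, payload TEXT);
CREATE TABLE watermark (last_batch INTEGER);
INSERT INTO watermark VALUES (0);
INSERT INTO landing VALUES (1, 'a'), (1, 'b'), (2, 'c');
""")

def run_batch(conn, batch_id):
    # Restart safety: consult the watermark before doing any work.
    (last,) = conn.execute("SELECT last_batch FROM watermark").fetchone()
    if batch_id <= last:
        return "skipped"  # batch already processed on a previous run
    conn.execute(
        "INSERT INTO target SELECT batch_id, payload FROM landing WHERE batch_id=?",
        (batch_id,))
    conn.execute("UPDATE watermark SET last_batch=?", (batch_id,))
    conn.commit()  # load and watermark advance commit together
    return "done"

print(run_batch(conn, 1))  # done
print(run_batch(conn, 1))  # skipped -- a rerun is a no-op
print(run_batch(conn, 2))  # done
n = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(n)  # 3 rows loaded, none duplicated
```

Committing the data load and the watermark update in one transaction is what makes the chain safely restartable.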
Technical Data Wiring Engineer (m/f/d) for AIRBUS

Our offer:
- Exciting jobs at interesting companies such as Airbus Operations, Airbus Defence & Space, and the aerospace supplier industry
- Attractive and performance-based salary conditions
- Extensive employee benefit program Orizon PlusPunkte
- Suitable training measures within the scope of your tasks
- Up to 30 days of annual leave
- Personal support and qualified advice at the Unit Aviation locations, e.g.
Fresenius Medical Care Deutschland GmbH is looking for a (Senior) Data Privacy Engineer (m/f/d) / (Senior) Expert (m/f/d) – Technical Privacy (ID number: 13676634)
Design, develop and deploy digital solutions ensuring the software development life cycle in an agile setup. Create technical documentation. Analyze and decompose business requirements into technical functionalities. Produce clean and efficient code based on business requirements and specifications.
What you will do:
- Development and evaluation of statistical models and algorithms for complex marketing issues
- Independent analysis of complex data with the aim of identifying new insights and potential for performance optimization
- Identifying direct and indirect correlations between relevant key figures and deriving recommendations for action
- Linking and using the content of data from tracking systems and other reporting sources
- Support in the further development and testing of performance-relevant (attribution) models
- Initiation and further development of prediction and classification models using machine learning algorithms

Who you are:
- You bring at least seven years of hands-on experience in Data Engineering, ideally in an agency, e-commerce, or performance-driven environment
- You have initial experience with machine learning algorithms and a solid understanding of common data analysis methods such as regression and clustering; knowledge of marketing attribution models is a strong plus
- You are proficient in SQL and either Python or R (both are a bonus)
- Experience with Dagster or comparable data orchestration tools is highly appreciated
- You are naturally curious, enjoy exploring new topics, statistical methods, and emerging technologies, and stay up to date with current technical developments

Benefits: hybrid working, fresh fruit daily, sports classes, free entry to code.talks, exclusive employee discounts, free drinks, language courses, free Laracasts account, company events, relocation support, mobility allowance, state-of-the-art technologies, central location, flexible working hours, company pension scheme, training opportunities, dogs allowed, AY Academy, feedback culture, company bicycle.

YOU ARE THE CORE OF ABOUT YOU.
What you will do:
- Development and evaluation of statistical models and algorithms for complex marketing issues
- Independent analysis of complex data with the aim of identifying new insights and potential for performance optimization
- Identifying direct and indirect correlations between relevant key figures and deriving recommendations for action
- Linking and using the content of data from tracking systems and other reporting sources
- Support in the further development and testing of performance-relevant (attribution) models
- Initiation and further development of prediction and classification models using machine learning algorithms

Who you are:
- You bring at least two years of hands-on experience in Data Engineering, ideally in an agency, e-commerce, or performance-driven environment
- You have initial experience with machine learning algorithms and a solid understanding of common data analysis methods such as regression and clustering; knowledge of marketing attribution models is a strong plus
- You are proficient in SQL and either Python or R (both are a bonus)
- Experience with Dagster or comparable data orchestration tools is highly appreciated
- You are naturally curious, enjoy exploring new topics, statistical methods, and emerging technologies, and stay up to date with current technical developments

Additional information:
Working model: Due to the upcoming tasks and responsibilities for this position, it is required to work onsite at our headquarters in Hamburg or Berlin on a weekly basis.
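The attribution models mentioned above can be illustrated by comparing two simple credit-assignment rules: last-touch gives all conversion value to the final channel, while linear attribution splits it evenly across every touchpoint. The conversion path and channel names are invented for the example.

```python
def last_touch(path, value):
    # All credit goes to the final touchpoint before conversion.
    return {path[-1]: value}

def linear(path, value):
    # Credit is split evenly across all touchpoints; channels that appear
    # more than once accumulate multiple shares.
    share = value / len(path)
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# Hypothetical path of one converting user worth 100.0 in revenue.
path = ["seo", "display", "email", "email"]
print(last_touch(path, 100.0))  # {'email': 100.0}
print(linear(path, 100.0))      # {'seo': 25.0, 'display': 25.0, 'email': 50.0}
```

Production attribution models (time-decay, position-based, data-driven) refine these rules but share the same structure: a path of touchpoints mapped to a credit distribution.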
Tags: Elasticsearch, AWS, Python, Google BigQuery, Google Cloud Platform, NumPy, pandas, GitLab

What you will do
- Design and develop innovative algorithms to power a personalized shopping experience, leveraging cutting-edge machine learning techniques
- Deploy your solutions into production, taking full ownership and ensuring high performance and scalability
- Combine your data science expertise with a pragmatic, agile approach to find innovative solutions and drive measurable results within a fast-paced environment
- Challenge the status quo by identifying areas for improvement in existing retrieval and reranking systems, particularly those relying heavily on business logic, and propose data-driven solutions
- Thrive in a dynamic, fast-paced environment with a flat hierarchy, where your ideas and contributions can make a real difference

Who you are
- Proficiency in Python or experience with at least one scientific computing language (e.g., MATLAB, R, Julia, C++)
- Strong SQL skills with experience in analytical or transactional database environments
- Theoretical understanding of machine learning principles, coupled with a hands-on approach to building and iterating on models
- Proven experience in building and deploying machine learning solutions that deliver tangible business value
- Strong understanding of data structures, algorithms, and tools for efficiently handling large datasets (e.g. pandas, numpy, dask, arrow, polars, …)
- Experience designing, building, and managing data pipelines
- Familiarity with cloud-based model training and serving platforms (e.g., GCP Vertex AI, Amazon SageMaker)
- Solid understanding of statistical methods for model evaluation
- Big Data: Experience analyzing large datasets using statistical and machine learning techniques
- DevOps: Familiarity with CI/CD tools (e.g., GitLab CI/CD, HashiCorp Terraform) is a plus
- Generative AI: Experience with generative AI and agentic frameworks (e.g., LangChain, ADK, CrewAI, Pydantic AI, …) is a plus
- Understanding of recommendation, retrieval and reranking systems in e-commerce and retail is a plus
- Excellent written and verbal communication skills in English
- Ability to effectively communicate complex machine learning concepts to both technical and non-technical stakeholders
- Proven ability to collaborate effectively within a team to establish standards and best practices for deploying machine learning models
- A proactive approach to knowledge sharing and fostering a quick development environment

Nice to have
- Experience with BigQuery
- Knowledge of time series and (graph) neural network models
- Familiarity with statistical testing and Gaussian Processes
- Strong knowledge of Computer Vision libraries (e.g.
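The retrieval and reranking systems mentioned above can be sketched as a score-blending step: instead of hard-coded business rules, the retrieval score is combined with a learned relevance signal and candidates are reordered. The weighting scheme, item names, and scores are illustrative assumptions, not a production design.

```python
def rerank(candidates, weight=0.7):
    # candidates: list of (item_id, retrieval_score, model_score),
    # both scores assumed normalized to [0, 1]. A fixed linear blend is the
    # simplest possible reranker; real systems learn this combination.
    scored = [
        (item, weight * retr + (1 - weight) * model)
        for item, retr, model in candidates
    ]
    return [item for item, _ in sorted(scored, key=lambda t: -t[1])]

candidates = [
    ("sku_1", 0.9, 0.2),  # strong retrieval match, weak predicted relevance
    ("sku_2", 0.7, 0.9),  # decent match, strong predicted relevance
    ("sku_3", 0.5, 0.1),
]
print(rerank(candidates))  # ['sku_2', 'sku_1', 'sku_3']
```

Sweeping `weight` is a cheap way to study how much the learned signal should be allowed to override raw retrieval order before shipping an A/B test.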
YOUR TASKS:
- Design, develop, and deploy digital solutions, ensuring the software development life cycle in an agile setup
- Develop solutions on a leading-edge cloud-based platform for managing and analyzing large datasets
- Create technical documentation
- Analyze and decompose business requirements into technical functionalities
- Produce clean and efficient code based on business requirements and specifications
- Create notebooks, pipelines, and workflows in Scala or Python to ingest, process, and serve data in our platform
- Be a technical lead for junior and external developers
- Be part of the continuous improvement of Nordex's development processes by participating in retrospectives and proposing optimizations

YOUR PROFILE:
- Technical degree in Computer Science, Software Engineering, or comparable
- Experience or certification in Databricks
- Fluent English
- At least 3 years of proven experience
- Availability to travel

YOUR BENEFITS: In addition to the opportunity to make our world a little more sustainable, we offer you: *Some offers may vary by location. **Hybrid working in accordance with the company's internal policy.
My employer: A successful, growing mid-sized provider of relevant, innovative data- and AI-driven solutions that optimize decision-making for well-known companies. The employer offers an open, agile, progress-oriented culture in which teamwork, personal responsibility, and continuous learning matter greatly; creative freedom in demanding projects, long-term prospects, and flexibility are included.

Your tasks:
- Design, implement, and evolve generative-AI and NLP systems that optimally meet customer requirements for performance, latency, cost, and extensibility
- Align regularly with stakeholders on requirements
- Collaborate closely with developers and technical leads on implementation and on use cases such as retrieval-based chatbots, agent systems, or fine-tuning of language models
- Plan and implement robust machine learning pipelines following best practices, on Azure, AWS, or GCP
- Provide input on complex technical challenges and present solution approaches
- Actively track new developments in NLP and AI so customers always receive modern, high-quality solutions

Your profile:
- Successfully completed degree in (business) computer science or comparable training
- Relevant professional experience in data science, ideally with several of the typical NLP/LLM tools such as OpenAI APIs, Bedrock, Azure AI Foundry, LangChain, LangGraph, Instructor, Hugging Face, Tokenizers, vector databases, performant inference, model deployment, MCP/A2A, and dataset creation
- Very good knowledge of machine learning and deep learning, especially transformer models, LLMs, and generative AI
- Confident use of production-ready frameworks such as PyTorch as well as agent frameworks such as LangGraph, SmolAgents, OpenAI Agent SDK, CrewAI, or PydanticAI
- Deep understanding of model optimization with PEFT such as QLoRA, instruction fine-tuning, post-training, inference optimization, and embeddings
- Confident in workflows such as conversational AI, RAG, information extraction, tool calling, and LLM evaluation
- Knowledge of agentic RAG, GraphRAG, multi-agent systems, text-to-SQL, and code retrieval
- Very good understanding of deployments and MLOps on Azure, GCP, or AWS
- High standards for software quality and the ability to write clean, performant, scalable code and put AI systems into production
- The ability to translate complex requirements into technical solutions and convey them confidently to non-experts, plus good German and English skills

What we offer:
- Independent work and opportunities to shape things thanks to short decision paths
- Growth through a focus on innovation: the varied projects revolve around developing intelligent algorithms, data-based strategies, and tailored AI solutions
- A committed, dynamic, constructive team with strong cohesion and an open feedback culture
- Well-connected, modern offices and high-quality technical equipment
- A flexibly plannable remote share of up to 40%, at times even from other EU countries
- Subsidies for the Deutschland-Ticket, for sports and wellness offers, and for childcare

Salary information: Depending on experience, up to €100,000 p.a.
Join a sponsor-dedicated team and contribute to the advancement of in-house study activities over time. As the R Programming Lead, you will provide technical expertise to the Statistical Programming team, ensuring the delivery of high-quality solutions that meet both internal and external requirements.
What you will do:
- Prepare and structure topics, requirements, and stakeholder requests
- Answer stakeholder questions and create clear, reliable documentation
- Support and drive new projects from concept phase to implementation
- Manage and refine incoming requests via the team's ticketing workflow
- Derive and define development tasks based on business and data needs
- Analyze data flows and assess the impact of changes on existing data deliverables
- Evaluate new data sources and identify opportunities for process optimization
- Ensure transparency and quality of KPIs and financial data

Who you are:
- 1+ years of experience in a Technical Business Analyst, Data, or similar role
- Hands-on mentality and strong willingness to learn
- Experience working with data and IT systems
- Solid SQL skills (BigQuery is a plus)
- Understanding of data structures and data flows
- Basic programming knowledge is a nice-to-have

Additional information: Working model: Due to the upcoming tasks and responsibilities for this position, it is required to work onsite at our headquarters in Hamburg or Berlin on a weekly basis.
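As a hedged sketch of the "solid SQL skills" and KPI-transparency work described above: a minimal aggregation query, shown here with Python's built-in sqlite3 rather than BigQuery (the dialect and client would differ there); the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, revenue REAL);
    INSERT INTO orders VALUES (1, 'DE', 120.0), (2, 'DE', 80.0), (3, 'AT', 50.0);
""")
# A typical KPI query: total revenue per region, largest first.
rows = conn.execute("""
    SELECT region, SUM(revenue) AS total_revenue
    FROM orders
    GROUP BY region
    ORDER BY total_revenue DESC
""").fetchall()
```

The same GROUP BY / ORDER BY pattern carries over directly to BigQuery SQL, with only the connection and data-loading code changing.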
Stakeholder Coordination
Support communication between:
- Sales department
- IT operations team
- IT middleware team
- MS Project Operations implementation partner
Translate technical requirements into clear implementation steps.

Required Qualifications
Proven experience as a Salesforce Consultant with strong technical configuration expertise.
This role supports Pharmacometrics by producing high-quality, compliant datasets for modeling and analysis, ensuring accuracy from early (unclean) through post-lock clinical data. The ideal candidate brings deep technical expertise, strong problem-solving skills, and a solid understanding of PK/PD principles.

Key Responsibilities
- Program, validate, and deliver NONMEM-ready PK/PD datasets based on SDTM/ADaM standards using advanced R programming skills.
- Create high-quality PK/PD datasets for both pre-lock and post-lock clinical data.
- Independently execute programming tasks of medium to high complexity with excellent accuracy and timeliness.
- Critically review data, identify inconsistencies or gaps, and propose solutions to improve dataset quality and programming efficiency.
- Perform quality control (QC) of NONMEM datasets, including those produced by external partners.
- Support preparation of deliverables for regulatory submissions following internal Pharmacometrics guidelines.
- Conduct QC of customized R packages used for pharmacometrics workflows; enhance or build automated test suites where needed.
- Liaise with cross-functional teams including Data Management, Biostatistics, Statistical Programming, and Bioanalytical groups to resolve data issues and ensure alignment.
- Adhere to relevant SOPs, working instructions, and regulatory standards; maintain inspection readiness.
- Contribute as a technical driver in the development and improvement of new PM standardization initiatives related to dataset creation and QC.
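For readers unfamiliar with what a "NONMEM-ready dataset" means in the responsibilities above: dosing and observation records are stacked into one long table with conventional NONMEM columns (ID, TIME, AMT, DV, EVID). The role uses R; this hedged sketch shows the same reshaping in Python/pandas with invented input data, purely for illustration.

```python
import pandas as pd


def build_nonmem_dataset(doses: pd.DataFrame, pk_obs: pd.DataFrame) -> pd.DataFrame:
    """Stack dosing and observation records into one NONMEM-style dataset."""
    doses = doses.assign(EVID=1, DV=".")   # dosing events carry no observed value
    obs = pk_obs.assign(EVID=0, AMT=".")   # observations carry no dose amount
    cols = ["ID", "TIME", "AMT", "DV", "EVID"]
    out = pd.concat([doses[cols], obs[cols]], ignore_index=True)
    # Records sorted by subject and time; doses before same-time observations
    # here (EVID descending), a simple tiebreak whose direction is study-specific.
    return out.sort_values(["ID", "TIME", "EVID"],
                           ascending=[True, True, False]).reset_index(drop=True)


doses = pd.DataFrame({"ID": [1, 1], "TIME": [0.0, 24.0], "AMT": [100, 100]})
pk_obs = pd.DataFrame({"ID": [1, 1], "TIME": [1.0, 24.0], "DV": [5.2, 3.1]})
ds = build_nonmem_dataset(doses, pk_obs)
```

In practice the real work lies in deriving TIME from nominal and actual timestamps, handling missing and below-limit-of-quantification values, and documenting every derivation for QC, which is what the responsibilities list above emphasizes.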
You'll work closely with cross-functional teams to enhance user experience and contribute to innovative solutions in a highly technical environment.

What You'll Do
- Design, implement, and test .NET applications for audio measurement and data analysis
- Develop features for post-processing of large data sets (millions of data points)
- Perform calculations based on audio measurement standards
- Implement remote control functionalities for measurement devices
- Create user-friendly reports and visualization tools
- Collaborate with hardware and firmware teams to ensure smooth integration
- Provide 3rd-level support and interact directly with customers when needed

What You Bring
- Degree in Computer Science, Software Engineering, or a related discipline
- 3+ years of experience in full-stack software development
- Strong proficiency in .NET and WPF (C#)
- Experience with customer-facing desktop applications
- Analytical mindset and attention to detail
- Solution-oriented, collaborative, and proactive personality
- Fluent in English; German knowledge is an advantage

What We Offer
- A diverse and technically challenging position in an international environment
- Supportive, open, and family-friendly workplace culture
- Very flexible working hours and a high degree of autonomy
- Collaboration with a team of passionate and experienced developers
- Flat hierarchies and an "open-door" mindset that encourages innovation

The DEKRA Arbeit Group is part of DEKRA SE and has for years been one of the fastest-growing personnel service providers in all of Europe.
Ruhr:
- Strategic, analytical mindset with a strong data-quality focus
- Independent, ownership-driven working style
- Clear and structured documentation skills
- Mentoring and knowledge-sharing mindset
- At least six years' experience in implementation projects, of which at least two years must have been spent in thematically comparable IT projects
- Special knowledge in several technical areas

PM:
- At least two years' experience in IT project management with full management responsibility for the project staff
- Management of Cat.
The Oberalp Group is a management-driven family business, a house of brands that creates high-quality technical mountaineering products. We have six brands of our own: Salewa, Dynafit, Pomoca, Wild Country, Evolv, and LaMunt. As an exclusive partner of other brands in the sports sector, we offer our entire know-how in communication, sales, and image building.
Open for change and feedback is what defines our culture. We support you on your journey: individual learning opportunities, worldwide job opportunities, or technical training from our academy. The safety and well-being of our employees is important to us, which is why we set high standards for your workplace safety.
SVTs, GSVTs)
- Elaboration of the operations concept as well as preparation of operational requirements and operational products (Flight & Ground Operations Procedures, i.e. FOPs & GOPs) for the satellite control center
- Provide technical support to the project managers on planning, contractual changes, and risk & opportunity identification

Your profile:
- Completed studies in Computer Science, Electrical Engineering, Telecommunications, or comparable
- Several years of work experience in IT / information systems and aerospace
- Experience in spacecraft control center and payload control center systems engineering
- Knowledge of satellite control center systems engineering with respect to technical interfaces, data flow (for user and management data), and TM/TC engineering (secure and unsecured)
- Experience in ground station engineering
- Strong experience in software-based systems, networks, and IT infrastructure
- Fluent English and German, both spoken and written

Application and questions: We look forward to receiving applications (stating ID number 2120, availability, and salary expectations), preferably by e-mail to bewerbung.aviation@orizon.de or via the application module on this page.
This position ensures the uninterrupted operation of all systems and responds promptly to technical disruptions and emergencies.

Main Tasks and Responsibilities
- Continuous monitoring of all critical data center infrastructures (power, cooling, electricity, fire protection)
- Conducting regular inspection rounds and documenting system statuses
- Responding to alarms and fault messages according to established procedures
- Performing initial diagnostics for technical problems and initiating appropriate measures
- Documentation of all incidents, measures, and completed work
- Support with planned maintenance work and infrastructure projects
- Ensuring compliance with security guidelines and access controls
- Communication with customers and external service providers during on-site deployments
- Processing support requests during the shift
- Handover of relevant information to subsequent shift teams

Requirements Profile
- Completed training in a technical profession (electrical engineering, IT systems electronics, mechatronics, or comparable)
- Ideally first experience in data center environments or IT infrastructure
- Basic knowledge of power supply, climate technology, and network technology
- Understanding of the importance of critical infrastructure and its high availability
- Technical problem-solving understanding and a methodical approach
- Fluent German and good English
- Team player with an independent working style
- Resilience and the ability to act calmly in emergency situations
- Basic knowledge of DCIM and ticketing systems

Additional Desirable Qualifications
- Initial experience with data center components such as UPS, HVAC systems, and generators
- Basic knowledge of server and network hardware
- Understanding of security and compliance requirements in data centers
- Knowledge of MS Office and monitoring tools
- Class B driving license

Working Environment
- Shift work according to shift schedule (early/late/night)
- State-of-the-art data center environment with strict security and quality standards
- Team-oriented work in a technically demanding environment
- Opportunity for continuous education and professional development
- Responsible position with direct influence on the operational security of critical IT infrastructure

Additional Information
Orientation: Comprehensive training on all systems and emergency procedures
Location: On-site, Schwalbach, Germany

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements.
Your responsibilities:
- Quality control and assurance of 3D data quality in the zone/component
- Quality control of DMU integration between different work packages/suppliers
- Verification of 3D data quality for the approval process of the definition dossier
- Management of the configuration level of the product structure in accordance with applicable procedures and methods
- Centralization of all activities related to linking 3D design data with the upper level of the product structure
- Ensuring and monitoring the quality of the cDMU (Configuration Digital MockUp)
- Supporting technical departments in solving configuration problems and controlling all activities related to 3D data exchange
- Contributing to the establishment and maintenance of the DMU integrator network at suppliers
- Ensuring and enabling multi-ATA cDMU to harmonize processes, methods, and tools across departments, components, and programs

Your profile:
- Completed studies in engineering, aerospace, mechanical engineering, or comparable
- Professional experience in configuration and interface management, production planning/control, and aircraft construction
- Basic knowledge of project management and CIP
- Good knowledge of 3D measurement methods
- Product lifecycle experience
- Good knowledge of digital mockups
- Confident handling of SAP, Catia V5, VPM A350, and Google Workspace
- Fluent English, spoken and written
- Knowledge of German is an advantage

Application and questions: We look forward to receiving applications (stating ID number 957, availability, and salary expectations), preferably by e-mail to bewerbung.aviation@orizon.de or via the application module on this page.
Collaboration and Communication: Work closely with stakeholders, including business users, IT teams, and external partners. Communicate technical information to non-technical stakeholders, ensuring clear understanding of application capabilities and limitations. Documentation and Knowledge Management: Develop and maintain documentation for applications, including design documents, user guides, and technical notes.
TypeScript)
- Solid know-how in working with SQL or NoSQL databases
- You write well-structured, efficient, and maintainable code and actively keep the quality of the code base in check
- You have excellent analytical and problem-solving skills
- You are used to working in an English-speaking and agile environment
- You raise the bar of the software you work on
- You enjoy investigating the 'why' behind specific search outcomes and translating those findings into actionable technical enhancements

Nice to have
- Experience with Elasticsearch or OpenSearch
- Experience building data models using dbt
- Experience in working with an MPP data warehouse (e.g.
Role Purpose As the Sales Engineer, you have a history of honing and exercising technical, quantitative, and commercial capabilities to achieve material business outcomes. You are obsessed with understanding a client’s business and the best mechanism of extending a platform to help them meet their needs.