Data Science Software Engineer, DataRobot

Written By:
Srinivas
Reviewed By:
Sankha Ghosh
DataRobot is looking for a Data Science Software Engineer to build production-ready AI solutions that help organizations around the globe adopt AI/ML at scale. In this role, candidates will develop backend infrastructure, work with data science teams, and implement AI/ML solutions that are resilient, scalable, and maintainable.

Location: Remote (India)


Primary Responsibilities

  • Build and maintain reliable, scalable, production-quality backend solutions that serve data scientists' AI/ML pipelines.

  • Design architecture and reusable components for rapid integration of AI capabilities into SAP and other enterprise solutions.

  • Create data pipelines that connect multiple data sources to ML models for analytics and AI solutions.

  • Use programming tools and technologies to build AI solutions that are resilient, maintainable, and easy to deploy with minimal operational friction.

  • Collaborate with data scientists to understand their infrastructure needs and translate them into recommended technical solutions.

  • Tackle engineering problems during high-intensity project phases and contribute to innovation during quieter periods.

  • Apply, track, and enhance engineering best practices to achieve better performance, scalability, and maintainability of AI/ML solutions.

  • Support and troubleshoot production systems to maintain reliability and performance.

  • Build and implement CI/CD pipelines that automate testing, building, and deployment of ML models.

  • Containerize applications and use infrastructure as code to ensure consistent environments.

  • Monitor system performance and establish logging strategies for ML applications.

Requirements

  • Over four years of experience with Python for backend services or data science workflows.

  • Strong ability to write efficient, maintainable, well-structured code with an eye toward reusability and scalability.

  • Experience developing APIs, preferably with FastAPI, Flask, or similar frameworks.

  • Experience with containerization (Docker) and orchestration (Kubernetes).

  • Experience with Infrastructure as Code tools (Terraform, CloudFormation, Pulumi).

  • Experience building CI/CD pipelines with tools such as Jenkins, GitHub Actions, or GitLab CI.

  • Experience with data engineering principles (ETL, data pipeline health).

  • Ability to handle varying workloads, comfortable with both focused project sprints and more exploratory work.

  • Highly coachable and eager to learn, with strong problem-solving ability and adaptability to changing goals or direction.

Nice to Have

  • Experience with cloud-based AI/ML infrastructure (AWS, Azure, GCP).

  • Familiarity with SAP integration technologies or enterprise system integrations.

  • Baseline knowledge of machine learning operations (MLOps) practices.

  • Understanding of data and model versioning.

  • Exposure to generative AI techniques such as RAG and fine-tuning.

About DataRobot

DataRobot powers the future of enterprise AI, helping organizations transform data into real-time, actionable insights. Its platform combines automation of machine learning and generative AI workflows, enabling organizations to make smarter decisions, reduce risk, and deliver real-world impact. DataRobot makes AI practical at scale and accessible to all teams and industries.

Analytics Insight: Latest AI, Crypto, Tech News & Analysis
www.analyticsinsight.net