1 - 2 of 2 job positions for keyword Sage


Skills:
Compliance, Research, Automation
Job type:
Full-time
Salary:
negotiable
DataOps, MLOps, and AIOps:
- Design, build, and optimize scalable, secure, and efficient data pipelines for AI/ML workflows.
- Automate data ingestion, transformation, and deployment across AWS, GCP, and Azure (a minimal pipeline sketch follows this list).
- Implement MLOps and AIOps for model versioning, monitoring, and automated retraining.
- Ensure performance, security, scalability, and cost efficiency in AI lifecycle management.
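For illustration only, a minimal sketch of one automated ingestion-and-transformation step of the kind described above. The bucket paths and column names are invented, and reading s3:// URLs with pandas assumes the s3fs package is installed:

    import pandas as pd

    RAW_PATH = "s3://example-raw/events/2024-01-01.csv"              # hypothetical source
    CURATED_PATH = "s3://example-curated/events/2024-01-01.parquet"  # hypothetical sink

    def run_ingestion_step() -> None:
        # Extract: read the raw daily export.
        df = pd.read_csv(RAW_PATH)

        # Transform: drop malformed rows and normalise timestamps to UTC.
        df = df.dropna(subset=["event_id", "ts"])
        df["ts"] = pd.to_datetime(df["ts"], utc=True, errors="coerce")
        df = df.dropna(subset=["ts"])

        # Load: write a columnar copy for downstream AI/ML jobs.
        df.to_parquet(CURATED_PATH, index=False)

    if __name__ == "__main__":
        run_ingestion_step()

In practice a step like this would be scheduled by an orchestrator rather than run by hand.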
Performance Optimization & Security:
- Monitor, troubleshoot, and optimize AI/ML pipelines and data workflows to enhance reliability (a small reliability wrapper is sketched after this list).
- Implement data governance policies, security best practices, and compliance standards.
- Collaborate with cybersecurity teams to address vulnerabilities and ensure data protection.
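As one possible pattern, not anything prescribed by the posting, a pipeline step can be wrapped with retries and structured logging so failures surface in monitoring; the retry count and backoff here are arbitrary assumptions:

    import functools
    import logging
    import time

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("pipeline")

    def monitored(retries: int = 3, backoff_s: float = 5.0):
        """Retry a flaky pipeline step and log timing/failures for monitoring."""
        def decorator(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                for attempt in range(1, retries + 1):
                    start = time.monotonic()
                    try:
                        result = fn(*args, **kwargs)
                        log.info("%s succeeded in %.1fs", fn.__name__, time.monotonic() - start)
                        return result
                    except Exception:
                        log.exception("%s failed (attempt %d/%d)", fn.__name__, attempt, retries)
                        if attempt == retries:
                            raise
                        time.sleep(backoff_s * attempt)
            return wrapper
        return decorator

    @monitored(retries=2)
    def refresh_feature_table():
        ...  # the real transformation would go here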
Data Engineering & System Integration:
- Develop and manage real-time and batch data pipelines to support AI-driven applications.
- Enable seamless integration of AI/ML solutions with enterprise systems, APIs, and external platforms.
- Ensure data consistency, quality, and lineage tracking across the AI/ML ecosystem (a lineage sketch follows this list).
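One lightweight way to track lineage, sketched here with hypothetical field names and a made-up sidecar-file convention, is to emit a metadata record alongside every dataset a pipeline writes:

    import json
    from dataclasses import dataclass, asdict, field
    from datetime import datetime, timezone

    @dataclass
    class LineageRecord:
        """Metadata emitted next to each output so consumers can trace its origin."""
        output_path: str
        input_paths: list[str]
        job_name: str
        code_version: str  # e.g. a git commit SHA
        produced_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def write_lineage(record: LineageRecord) -> None:
        # Convention assumed here: a ".lineage.json" sidecar beside the output file.
        with open(record.output_path + ".lineage.json", "w") as f:
            json.dump(asdict(record), f, indent=2)

    write_lineage(LineageRecord(
        output_path="events.parquet",
        input_paths=["raw/2024-01-01.csv"],
        job_name="daily_events_etl",
        code_version="abc1234",
    ))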
AI/ML Model Deployment & Optimization:
- Deploy and manage AI/ML models in production, ensuring accuracy, scalability, and efficiency.
- Automate model retraining, performance monitoring, and drift detection for continuous improvement (a drift-check sketch follows this list).
- Optimize AI workloads for resource efficiency and cost-effectiveness on cloud platforms.
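Drift detection can start as simply as comparing a feature's live distribution against its training baseline. A minimal sketch using the population stability index; the 0.2 alert threshold is a common rule of thumb rather than a standard, and the data is synthetic:

    import numpy as np

    def psi(baseline: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
        """Population stability index between two samples of one feature."""
        # Bin edges come from the training baseline so both samples share them.
        edges = np.histogram_bin_edges(baseline, bins=bins)
        b = np.histogram(baseline, bins=edges)[0] / len(baseline)
        c = np.histogram(current, bins=edges)[0] / len(current)
        # Clip to avoid log(0) and division by zero in empty bins.
        b, c = np.clip(b, 1e-6, None), np.clip(c, 1e-6, None)
        return float(np.sum((c - b) * np.log(c / b)))

    rng = np.random.default_rng(0)
    train = rng.normal(0.0, 1.0, 10_000)  # stand-in for training data
    live = rng.normal(0.5, 1.0, 10_000)   # shifted "production" data
    score = psi(train, live)
    print(f"PSI = {score:.3f} -> {'retrain' if score > 0.2 else 'ok'}")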
Continuous Learning & Innovation:
- Stay updated on AI/ML advancements, cloud technologies, and big data innovations.
- Contribute to proof-of-concept projects, AI process improvements, and best practices.
- Participate in internal research, knowledge-sharing, and AI governance discussions.
Cross-Functional Collaboration & Business Understanding:
- Work with business teams to ensure AI models align with organizational objectives.
- Gain a basic understanding of how AI/ML supports predictive analytics, demand forecasting, automation, personalization, and content generation.
- Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field. Advanced degrees or relevant certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer, Azure Data Engineer) are a plus.
Experience:
- Minimum of 3-5 years' experience in a data engineering or operations role, with a focus on DataOps, MLOps, or AIOps.
- Proven experience managing cloud platforms (AWS, GCP, and/or Azure) in a production environment.
- Hands-on experience with designing, operating, and optimizing data pipelines and AI/ML workflows.
Technical Skills:
- Proficiency in scripting languages such as Python and Bash, along with experience using automation tools.
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes) is desirable.
- Strong knowledge of data processing frameworks (e.g., Apache Spark) and data pipeline automation tools (a Spark sketch follows this list).
- Expertise in data warehouse solutions and emerging data lakehouse architectures.
- Experience with AWS technologies is a plus, especially AWS Redshift and AWS SageMaker, as well as similar tools on other cloud platforms.
- Understanding of machine learning model deployment and monitoring tools.
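For context on the Spark item above, a minimal PySpark batch job; the paths and column names are illustrative assumptions, not anything specified by the role:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

    # Read the curated events produced by an upstream ingestion step (hypothetical path).
    events = spark.read.parquet("s3a://example-curated/events/")

    # Aggregate to one row per user per day for downstream feature engineering.
    daily = (
        events
        .withColumn("day", F.to_date("ts"))
        .groupBy("user_id", "day")
        .agg(
            F.count("*").alias("event_count"),
            F.countDistinct("session_id").alias("sessions"),
        )
    )

    daily.write.mode("overwrite").partitionBy("day").parquet("s3a://example-features/daily/")
    spark.stop()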
2 days ago
Skills:
Node.js, Java, Spring Boot
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust backend services using Node.js, Nest.js, Java, Spring Boot, Camel, and cloud platforms like AWS and GCP.
- Design and build scalable, event-driven, and failure-tolerant systems. Advocate for and implement best practices in DevSecOps, test-driven development (TDD), and continuous delivery pipelines.
- Collaborate on diverse projects in domains such as Payment, Cart, Fulfillment, Search, and Recommendation.
- Vector Search: Working with vector similarity search to enhance relevance (a brief sketch follows this list).
- ML Models (XGBoost, CNNs): Applying machine learning models for search relevance and personalization.
- LLMs & PEFT: Fine-tuning large language models using Parameter-Efficient Fine-Tuning (PEFT).
- (These skills are not mandatory but would be considered a strong plus.)
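A bare-bones illustration of the vector-search item above: brute-force cosine similarity over toy embeddings. Production systems would normally use an approximate-nearest-neighbour index instead, and all vectors here are random stand-ins:

    import numpy as np

    def top_k_cosine(query: np.ndarray, corpus: np.ndarray, k: int = 3) -> list[int]:
        """Return indices of the k corpus vectors most similar to the query."""
        # Normalise so the dot product equals cosine similarity.
        q = query / np.linalg.norm(query)
        c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
        scores = c @ q
        return np.argsort(-scores)[:k].tolist()

    rng = np.random.default_rng(42)
    embeddings = rng.normal(size=(1_000, 128))  # stand-in for product/document embeddings
    query_vec = rng.normal(size=128)            # stand-in for an encoded search query
    print(top_k_cosine(query_vec, embeddings))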
- 7+ years of experience in backend development, focusing on Node.js, Nest.js, Java, Spring Boot, Camel, and cloud platforms like AWS and GCP.
- Strong knowledge of PostgreSQL, Redis, distributed locking mechanisms, functional programming, design patterns, and advanced isolation levels.
- Hands-on experience with REST and GraphQL API development.
- Familiarity with Kafka, SQS, Kubernetes, and containerized application deployment.
- Practical experience with OLAP databases like BigQuery and Redshift, analytics tools such as Mixpanel and Amplitude, and AI platforms like SageMaker, MLflow, and Vertex AI.
- Knowledge of NLP, data structures like graphs, BK Trees, B+ Trees, and the Pub/Sub paradigm (a BK-tree sketch follows this list).
- Excellent communication, collaboration, and problem-solving skills with a growth-oriented mindset.
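To make one of the less common items above concrete: a minimal BK-tree over Levenshtein distance, the structure typically used for fuzzy string matching in search; the word list is invented for the example:

    def edit_distance(a: str, b: str) -> int:
        """Classic dynamic-programming Levenshtein distance."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]

    class BKTree:
        """BK-tree: children are keyed by their distance to the parent word."""
        def __init__(self, word: str):
            self.word, self.children = word, {}

        def add(self, word: str) -> None:
            d = edit_distance(word, self.word)
            if d in self.children:
                self.children[d].add(word)
            else:
                self.children[d] = BKTree(word)

        def search(self, word: str, max_dist: int) -> list[str]:
            d = edit_distance(word, self.word)
            hits = [self.word] if d <= max_dist else []
            # Triangle inequality: only branches with |key - d| <= max_dist can match.
            for key, child in self.children.items():
                if abs(key - d) <= max_dist:
                    hits += child.search(word, max_dist)
            return hits

    tree = BKTree("cart")
    for w in ["card", "care", "chart", "search", "part"]:
        tree.add(w)
    print(tree.search("cart", 1))  # -> words within edit distance 1 of "cart"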
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
29 days ago