Skills:
ETL, Compliance, SQL
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into a centralized data warehouse or data lake.
- Integrate data from different sources, such as databases, APIs, and third-party applications, ensuring data consistency and accuracy.
- Create and maintain data models and schemas to facilitate data storage and retrieval, following best practices for data warehousing and database management.
- Implement data quality checks and validation processes to ensure data accuracy, completeness, and consistency.
- Optimize data pipelines and systems for performance, scalability, and efficiency, making sure data processing meets business requirements.
- Implement data security measures to protect sensitive information and ensure compliance with data privacy regulations (e.g., GDPR, HIPAA).
- Document data engineering processes, data lineage, and system architecture to facilitate knowledge sharing and future maintenance.
- Set up monitoring and alerting systems to detect and address issues with data pipelines and systems proactively.
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand their data needs and provide the necessary infrastructure and support.
- Stay up-to-date with the latest data engineering technologies and tools, and evaluate their applicability to the organization's data stack.
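The pipeline and data-quality duties listed above can be illustrated with a minimal sketch (not this role's actual stack; the table, columns, and sample rows are hypothetical): extract raw records, apply validation and normalization, and load the clean rows into a warehouse table.

```python
import sqlite3

# Hypothetical source rows, standing in for an extract from an API or database.
SOURCE_ROWS = [
    {"order_id": 1, "amount": "19.99", "country": "TH"},
    {"order_id": 2, "amount": "5.00", "country": "th"},
    {"order_id": 3, "amount": None, "country": "SG"},  # fails the completeness check
]

def extract():
    return SOURCE_ROWS

def transform(rows):
    """Normalize types and drop rows that fail basic data-quality checks."""
    clean = []
    for row in rows:
        if row["amount"] is None:  # completeness check
            continue
        clean.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),     # type consistency
            "country": row["country"].upper(),  # value normalization
        })
    return clean

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :country)", rows)
    conn.commit()

warehouse = sqlite3.connect(":memory:")  # stand-in for the data warehouse
load(transform(extract()), warehouse)
loaded = warehouse.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(loaded, "rows loaded")
```

In a production pipeline the same three stages would typically run under an orchestrator such as Apache Airflow, with the quality checks emitting metrics for the monitoring and alerting duty above.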
- Bachelor's degree in computer science, information technology, or a related field. A master's degree is a plus.
- Strong proficiency in data engineering tools and technologies, such as SQL and ETL frameworks (e.g., Apache Spark, Apache Airflow, Apache Beam).
- Experience with Google BigQuery for data warehousing.
- Experience with Google Cloud Dataflow or Dataproc.
- Experience with programming languages like Python, Java, or Scala.
- Experience with building streaming data pipelines on Google Cloud.
- Knowledge of database design, data modeling, and data integration techniques.
- Familiarity with data governance, data security, and compliance standards.
- Problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Skills:
Research, Automation, Statistics
Job type:
Full-time
Salary:
negotiable
- Work on data architecture: use a systematic approach to plan, create, and maintain data architectures while keeping them aligned with business requirements.
- Collect data: obtain data from the right sources before initiating any work on the database; after formulating a set of dataset processes, store the optimized data.
- Conduct research: investigate industry practice to address any issues that can arise while tackling a business problem.
- Automate tasks: dive into the data and pinpoint tasks where manual participation can be eliminated with automation.
- Bachelor's degree in IT, computer science, statistics, mathematics, business, or a related field.
- Minimum of 3 years' experience in data engineering roles.
- Experience in the data analytics lifecycle, including problem identification, measurement/metrics, exploratory data analysis, and data insight presentation.
- Experience with data tools and languages such as cloud platforms, Python, Java, or similar.
- Experience with data processing, ETL workflows, and messaging queues such as Kafka.
- Data warehousing.
- Data structures.
- ETL tools and programming languages (Python, Java, PySpark).
Skills:
Compliance, Research, Automation
Job type:
Full-time
Salary:
negotiable
- DataOps, MLOps, and AIOps: Design, build, and optimize scalable, secure, and efficient data pipelines for AI/ML workflows.
- Automate data ingestion, transformation, and deployment across AWS, GCP, and Azure.
- Implement MLOps and AIOps for model versioning, monitoring, and automated retraining.
- Ensure performance, security, scalability, and cost efficiency in AI lifecycle management.
- Performance Optimization & Security: Monitor, troubleshoot, and optimize AI/ML pipelines and data workflows to enhance reliability.
- Implement data governance policies, security best practices, and compliance standards.
- Collaborate with cybersecurity teams to address vulnerabilities and ensure data protection.
- Data Engineering & System Integration: Develop and manage real-time and batch data pipelines to support AI-driven applications.
- Enable seamless integration of AI/ML solutions with enterprise systems, APIs, and external platforms.
- Ensure data consistency, quality, and lineage tracking across the AI/ML ecosystem.
- AI/ML Model Deployment & Optimization: Deploy and manage AI/ML models in production, ensuring accuracy, scalability, and efficiency.
- Automate model retraining, performance monitoring, and drift detection for continuous improvement.
- Optimize AI workloads for resource efficiency and cost-effectiveness on cloud platforms.
- Continuous Learning & Innovation: Stay updated on AI/ML advancements, cloud technologies, and big data innovations.
- Contribute to proof-of-concept projects, AI process improvements, and best practices.
- Participate in internal research, knowledge-sharing, and AI governance discussions.
- Cross-Functional Collaboration & Business Understanding: Work with business teams to ensure AI models align with organizational objectives.
- Gain a basic understanding of how AI/ML supports predictive analytics, demand forecasting, automation, personalization, and content generation.
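The drift-detection and automated-retraining duties above can be sketched in miniature: compare a live feature distribution against its training baseline and flag retraining when the shift exceeds a threshold. The feature values and the three-standard-error threshold here are illustrative assumptions, not this role's actual monitoring setup.

```python
from statistics import mean, stdev

def detect_drift(baseline, live, z_threshold=3.0):
    """Flag drift when the live mean departs from the training baseline
    by more than z_threshold standard errors."""
    mu, sigma = mean(baseline), stdev(baseline)
    standard_error = sigma / (len(live) ** 0.5)
    z = abs(mean(live) - mu) / standard_error
    return z > z_threshold

# Hypothetical feature values logged at training time and in production.
baseline = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8, 10.1, 10.4]
stable   = [10.1, 9.9, 10.3, 10.0]   # close to the baseline: no action
shifted  = [14.0, 13.5, 14.2, 13.8]  # clear shift: trigger retraining

print(detect_drift(baseline, stable))
print(detect_drift(baseline, shifted))
```

In an MLOps pipeline, a positive result from a check like this would typically enqueue a retraining job and raise an alert rather than just print a flag.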
- Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field. Advanced degrees or relevant certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer, Azure Data Engineer) are a plus.
- Experience: Minimum of 3-5 years' experience in a data engineering or operations role, with a focus on DataOps, MLOps, or AIOps.
- Proven experience managing cloud platforms (AWS, GCP, and/or Azure) in a production environment.
- Hands-on experience with designing, operating, and optimizing data pipelines and AI/ML workflows.
- Technical Skills: Proficiency in scripting languages such as Python and Bash, along with experience using automation tools.
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes) is desirable.
- Strong knowledge of data processing frameworks (e.g., Apache Spark) and data pipeline automation tools.
- Expertise in data warehouse solutions and emerging data lakehouse architectures.
- Experience with AWS technologies is a plus, especially AWS Redshift and AWS SageMaker, as well as similar tools on other cloud platforms.
- Understanding of machine learning model deployment and monitoring tools.
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit model, income estimation model and fraud model.
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support collection strategy.
- Bachelor's degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience with programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience building and implementing AI/ML solutions for the senior role.
- Experience with Python libraries: NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have:
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
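As a hedged illustration of the sentiment-analysis task mentioned above (not the team's actual model, which would be learned from labeled data), a minimal lexicon-based scorer for collection notes might look like:

```python
# Hypothetical sentiment lexicon; a production model would learn these
# weights from labeled collection-call transcripts instead.
POSITIVE = {"pay", "agree", "confirm", "resolve", "willing"}
NEGATIVE = {"refuse", "dispute", "angry", "unable", "complaint"}

def sentiment(text):
    """Return 'positive', 'negative', or 'neutral' from lexicon word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Customer willing to pay next week"))
print(sentiment("Customer angry and refuse to confirm"))
```

A collection strategy could then route accounts by predicted sentiment, e.g., escalating "negative" interactions to senior agents.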
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, data relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in data management or data engineering (retail or e-commerce business is preferable).
- Expert experience in query languages: SQL, Databricks SQL, PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Having good attitude toward team working and willing to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Finance, Usability Testing, Industry trends
Job type:
Full-time
Salary:
negotiable
- Advanced Requirement Gathering & Analysis: Engage with senior stakeholders across diverse business units (e.g., marketing, sales, operations, and finance) to extract and refine detailed business requirements.
- Lead process mapping initiatives and perform in-depth analysis of current workflows to identify strategic opportunities for system enhancements and digital transformation.
- Documentation & Functional Specification Leadership: Develop and oversee the creation ...
- Ensure documentation accuracy and consistency across projects, aligning technical details with overall business strategy.
- Business Case Development: Create robust business cases for technology projects, including detailed cost-benefit analyses, ROI assessments, and risk evaluations to support strategic decision-making.
- Project Development & Vendor Management: Lead project development activities, managing the RFI/RFP processes and evaluating vendor proposals to select the best-fit partners.
- Oversee vendor negotiations and ensure external solutions align with our business requirements.
- System Integration & Process Strategy: Evaluate existing systems and integration points, providing recommendations for improvements that optimize performance and enhance user experience.
- Lead system integration initiatives, ensuring that new and existing applications communicate effectively and that data flows seamlessly across platforms.
- Serve as a subject matter expert in systems analysis and process re-engineering, ensuring solutions align with both short-term needs and long-term strategic goals.
- User Experience (UX) Optimization: Take a hands-on approach to UX design, including prototyping, conducting usability testing, and implementing design improvements independently when needed.
- Collaborate with product teams to analyze and improve user interactions with both internal and external systems, ensuring intuitive, efficient, and user-centric designs.
- Leadership & Mentorship: Lead cross-functional meetings and workshops to build consensus and drive project initiatives.
- Provide guidance and mentorship to junior peers, fostering a culture of continuous improvement and professional development within the team.
- Collaboration & Stakeholder Management: Act as the primary liaison between business leaders, IT teams, and external vendors, ensuring seamless communication, timely delivery, and successful implementation of projects.
- Drive the prioritization of initiatives and manage expectations across multiple stakeholders.
- Continuous Improvement & Innovation: Monitor and evaluate system performance post-implementation, gathering feedback and proactively identifying further opportunities for enhancement.
- Stay current with industry trends and emerging technologies, incorporating innovative practices into business processes and system design.
- Bachelor's degree in Business Administration, Information Technology, Computer Science, or a related field. An advanced degree or relevant certifications are highly desirable.
- Experience: Minimum of 5 years' experience in a Business Analyst, Systems Analyst, or similar role within a dynamic, multi-domain business environment.
- Demonstrated experience in leading digital transformation projects, developing detailed Functional Requirement Documents, and working within agile or iterative development frameworks.
- Proven track record of managing complex system integrations, vendor evaluations (including RFI/RFP processes), and overall project management.
- Technical Skills: Expertise in process mapping and documentation tools (e.g., Lucidchart, Figma, or similar).
- In-depth understanding of software development lifecycles, database concepts, and integration methodologies.
- Strong familiarity with UX design principles and the ability to collaborate effectively with design teams.
- Product Management Knowledge: Understanding of product management processes, lifecycle, and strategies to align technical solutions with business outcomes.
- Cloud Platforms: Practical knowledge of cloud environments (e.g., AWS, Google Cloud, and Azure) to support scalable and flexible technology solutions.
- System Integration Expertise: Experience in designing and implementing system integrations, ensuring seamless communication between disparate systems and optimizing data flow.
- Soft Skills: Exceptional communication, presentation, and interpersonal skills, with the ability to interact confidently with senior management and technical teams.
- Superior analytical and problem-solving skills with a keen attention to detail.
- Proven leadership abilities, including project management, team mentoring, and stakeholder coordination.
- A proactive, strategic mindset with the ability to drive initiatives and manage multiple priorities in a fast-paced environment.
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouse.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in one of these RDBMSs such as Oracle.
- Develop ETL technical specifications; design, develop, test, implement, and support optimal data solutions.
- Develop and document ETL data mappings, data dictionaries, processes, programs, and solutions per established standards for data governance.
- Design and create codes for all related data extraction, transformation and loading (ETL) into database under responsibilities.
- Creates, executes, and documents unit test plans for ETL and data integration processes and programs.
- Perform problem assessment, resolution and documentation in existing ETL packages, mapping and workflows in production.
- Performance tuning of the ETL process and SQL queries and recommend and implement ETL and query tuning techniques.
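The SQL query tuning duty above can be illustrated with a small sketch: inspect the query plan before and after adding an index on the filtered column. The table and data here are hypothetical, and SQLite is used only as a self-contained stand-in for the RDBMS; the same SCAN-vs-indexed-SEARCH idea applies in Oracle and other engines.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(i, "EU" if i % 2 else "US", float(i)) for i in range(1000)],
)

QUERY = "SELECT SUM(amount) FROM sales WHERE region = 'EU'"

# Before tuning: the filter forces a full table scan.
before_plan = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchone()[3]

# A common tuning step: index the column used in the WHERE clause.
conn.execute("CREATE INDEX idx_sales_region ON sales(region)")
after_plan = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchone()[3]

print(before_plan)  # e.g. "SCAN sales" (exact wording varies by SQLite version)
print(after_plan)   # e.g. "SEARCH sales USING INDEX idx_sales_region (region=?)"
```

Checking the plan like this before and after a change is the basic loop behind the "recommend and implement query tuning techniques" responsibility.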
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IaC tools like Terraform or Azure ARM templates.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
Experience:
5 years required
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE, cost saving energy and report.
- Measured Efficiency air system and record annual report.
- Responsible for implementing electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS.
- Bachelor's degree in Engineering, Electrical Engineering, or a related field.
- More than 5 years of experience in the maintenance of electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS; implementation and support of electrical systems in buildings or data centers.
- At least 1 year's experience in designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS); implementation of and support for electrical systems in buildings.
- Able to use AutoCAD and Visio.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour on site from your home).
- Proficiency in English communication is beneficial for both reading and writing.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, and near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good in communication and writing in English.
- Good interpersonal and communication skills.
Experience:
3 years required
Skills:
English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for air condition & Fire protection systems.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE, cost saving energy and report.
- Measured Efficiency air system and record annual report.
- Responsible for implementation of mechanical systems such as comfort air and precision air conditioning.
- Responsible for implementation of Fire suppression such as FM200, NOVEC, CO2, Fire Sprinkler, Fire Pump, Fire alarm, and Vesda.
- Working period: office hours 9:00-18:00, and able to stand by on call or work onsite on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years' experience in the maintenance of air conditioning systems such as comfort air, precision air, air-cooled and water-cooled chillers, and pump motors; implementation and support of mechanical air conditioning systems in buildings or data centers.
- At least 1 year's experience in designing air conditioning systems (such as comfort air, precision air, air-cooled and water-cooled chillers, and pump motors); implementation of and support for mechanical air conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour on site from your home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Experience:
2 years required
Skills:
Social media, PHP, Python
Job type:
Full-time
Salary:
negotiable
- Department: Information Technology.
- Company: GMM Music Public Company Limited.
- Develop and maintain efficient and scalable data collection services from various sources.
- Monitor all job schedules within data pipelines to ensure smooth operations.
- Ensure data quality and integrity across various data sources.
- Enhance existing services to be more flexible and adaptable to data changes.
- Provide alerts and manage issues that may arise in the scheduling process to ensure continuous and efficient services performance.
- Ensure the APIs connection of Social Media and Third Party APIs.
- Work closely with data scientists and data analysts to generate impactful business outcomes.
- 2+ years of experience with HTML/CSS/JS/PHP/Python/SQL/NodeJS and Other Back-End Technologies.
- New graduates with strong specific skill are welcome.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Advanced skills in Python (including Selenium) and SQL.
- Basic understanding of job schedulers (cron) and Linux environments.
- Systematic thinking, an eye for automation, change management, and a drive to keep improving things are crucial traits for this position.
- 2+ years of experience.
- 1 position available.
Experience:
5 years required
Job type:
Full-time
Salary:
negotiable
- Be focal point and accountable person to cater engineering digital solutions in each of key focus areas.
- Provide implementation of engineering digital services through a range of digital methodologies, solutions.
- Develop standards and guidelines leveraging engineering data best practices (e.g., CFIHOS).
- Analyze as-is and to-be engineering processes and further define system / technology / business requirements.
- Capture and translate of business requirements to digitally enabled solutions.
- Ensure engineering data in specific focus areas are managed under the concept of a single source of truth, accepted in terms of quality, and controlled under appropriate user roles.
- Professional Knowledge & Experiences.
- Bachelor's degree or higher in Mechanical, Instrumentation, Electrical, or another engineering-related field.
- Minimum 5 years' experience in engineering, construction, maintenance, or operations, preferably in the oil and gas or a similar industry.
- Understand overall typical engineering & construction development processes, basic understanding of asset life cycle.
- Possess data and information skills, or be in a position of using, managing, and controlling data.
- Experience in key user roles involved in software/application development; understand how business requirements are translated into software requirements.
- Understand basic IT concepts, e.g., UI/UX design, application programming interfaces, databases.
- Having interest to learn in new digital solutions or technologies.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium, complex data sets that meet functional /non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In depth knowledge of data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5 years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python, and SQL.
- Clear hands-on experience with database systems: the Hadoop ecosystem, cloud technologies (e.g., AWS, Azure, Google), in-memory database systems (e.g., HANA, Hazelcast), traditional RDBMSs (e.g., Teradata, SQL Server, Oracle), and NoSQL databases (e.g., Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools: traditional ETL tools (e.g., Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
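The monitoring duty above can be sketched with a simple retry-with-alert wrapper around a pipeline stage; the flaky ingestion step, the alert channel (plain `print` here), and the retry policy are illustrative assumptions, not this team's actual tooling.

```python
import time

def run_with_retry(stage, max_attempts=3, backoff_s=0.0, alert=print):
    """Run a pipeline stage, retrying on failure and alerting on each
    failed attempt; re-raise once all attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return stage()
        except Exception as exc:
            alert(f"stage failed (attempt {attempt}/{max_attempts}): {exc}")
            if attempt == max_attempts:
                raise
            time.sleep(backoff_s)

# Hypothetical flaky ingestion step that succeeds on the second try.
calls = {"n": 0}
def ingest():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("source unavailable")
    return 1024  # rows ingested

rows = run_with_retry(ingest, backoff_s=0)
print(rows, "rows ingested after", calls["n"], "attempts")
```

In practice the alert callable would post to an on-call channel or paging system, and the backoff would grow exponentially between attempts.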
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
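As a rough sketch of the extract-transform-load flow described in this role, the following self-contained Python example validates and loads hypothetical records into an in-memory SQLite table standing in for a warehouse. All table and field names are invented for illustration; a production pipeline would typically use PySpark or managed cloud services as described above.

```python
import sqlite3

def extract():
    # Hypothetical source records; in practice these would come from an
    # API, file drop, or upstream database.
    return [
        {"id": 1, "amount": "120.50", "country": "TH"},
        {"id": 2, "amount": "not-a-number", "country": "SG"},  # fails validation
        {"id": 3, "amount": "89.00", "country": "TH"},
    ]

def transform(rows):
    # Data-quality check: coerce amounts to float, dropping malformed rows.
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"],
                          "amount": float(row["amount"]),
                          "country": row["country"]})
        except ValueError:
            continue  # in production: quarantine and alert instead
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales "
                 "(id INTEGER, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO sales VALUES (:id, :amount, :country)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (2, 209.5) - the malformed row was filtered out
```

The same extract/transform/load separation scales up directly: each stage becomes an independently testable, monitorable step in an orchestrated pipeline.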
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouse technologies.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in an RDBMS such as Oracle.
- Develop ETL technical specifications; design, develop, test, implement, and support optimal data solutions.
- Develop and document ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and write code for all related data extraction, transformation, and loading (ETL) into databases under your responsibility.
- Create, execute, and document unit test plans for ETL and data integration processes and programs.
- Assess, resolve, and document problems in existing ETL packages, mappings, and workflows in production.
- Tune the performance of ETL processes and SQL queries; recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS or Azure and their associated services.
- Knowledge of IaC tools like Terraform or Azure ARM templates.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
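The ETL and query tuning duties above can be illustrated with a small, hypothetical SQLite example: adding an index changes the planner's strategy from a full table scan to an index search. Table, column, and index names are invented, and the exact plan wording varies by SQLite version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders "
             "(id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index the planner must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index it can jump straight to the matching rows.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

print(plan_before)  # e.g. "SCAN orders"
print(plan_after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"
```

Reading the plan before and after a change is the basic loop of the query tuning work this role describes, whatever the RDBMS.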
Experience:
3 years required
Skills:
Kubernetes, Automation, Redis
Job type:
Full-time
Salary:
negotiable
- Platform Operations: Manage and operate our Kubernetes platform, ensuring high availability, performance, and security.
- Automation & Tooling: Design, develop, and implement automation solutions for operational tasks, infrastructure provisioning, and application deployment.
- Observability: Build and maintain a comprehensive observability stack (monitoring, logging, tracing) to proactively identify and resolve issues.
- Platform Stability & Performance: Implement and maintain proactive measures to ensure platform stability, performance optimization, and capacity planning.
- Middleware Expertise: Provide support and expertise for critical middleware tools such as RabbitMQ, Redis, and Kafka, ensuring their optimal performance and reliability.
- Incident Response: Participate in our on-call rotation, troubleshoot and resolve production incidents efficiently, and implement preventative measures.
- Collaboration: Collaborate effectively with development and other engineering teams.
- Positive attitude and empathy for others.
- Passion for developing and maintaining reliable, scalable infrastructure.
- A minimum of 3 years working experience in relevant areas.
- Experience in managing and operating Kubernetes in a production environment.
- Experienced with cloud platforms like AWS or GCP.
- Experienced with high availability, high-scale, and performance systems.
- Understanding of cloud-native architectures.
- Experienced with DevSecOps practices.
- Strong scripting and automation skills using languages like Python, Bash, or Go.
- Proven experience in building and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Deep understanding of monitoring, logging, and tracing tools and techniques.
- Experience with infrastructure-as-code tools (e.g., Terraform, Ansible).
- Strong understanding of Linux systems administration and networking concepts.
- Experience working with middleware technologies like RabbitMQ, Redis, and Kafka.
- Excellent problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Strong interest and ability to learn any new technical topic.
- Experience with container security best practices.
- Experience with chaos engineering principles and practices.
- Experience in the Financial Services industry.
- Opportunity to tackle challenging projects in a dynamic environment.
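Scripting and automation skills of the kind listed above often take the form of small operational utilities. As a purely illustrative sketch (not tied to any specific tool in this posting), here is a retry-with-exponential-backoff helper such as might wrap a flaky health-check call; the probe function is a made-up stand-in:

```python
import time

def retry(func, attempts=5, base_delay=0.01):
    """Call func until it succeeds, sleeping base_delay * 2**n between tries."""
    for n in range(attempts):
        try:
            return func()
        except Exception:
            if n == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * (2 ** n))

calls = {"count": 0}

def flaky_probe():
    # Stand-in for an HTTP health check that fails twice, then recovers.
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("endpoint not ready")
    return "healthy"

result = retry(flaky_probe)
print(result, calls["count"])  # healthy 3  (two failures were retried)
```

Exponential backoff keeps automated retries from hammering an already struggling service, which is why the pattern recurs in incident tooling and Kubernetes probes alike.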
Experience:
6 years required
Skills:
Compliance, Legal
Job type:
Full-time
Salary:
negotiable
- Lead the team to handle all quality excursions independently, and take effective actions on time.
- Oversee the product & material quality from NPI to mass production.
- Develop the quality control plan for respective area (material/process/product).
- Monitor and report quality KPI for internal (factory) and external (customer).
- Drive continuous improvement to benefit customer, CLS and supplier.
- Develop and maintain internal quality system, procedures, work instructions and workmanship standards.
- Lead process/product/system/supplier audits and follow up on improvement actions, including:
- Industry standards audits (e.g., ISO).
- Compliance audits.
- Safety audits, etc.
- Follow up on ECs (Engineering Changes) and SPCNs (Supplier Process Change Notifications) to ensure changes are implemented in a timely and accurate manner (whether initiated externally or internally).
- Coach junior quality staff to improve their quality knowledge.
- Accomplish the jobs assignment from the superior and participate in the quality strategy deployment.
- Knowledge/Skills/Competencies.
- Strong knowledge of quality tools, ISO and IPC standards and processes.
- Knowledge of software and its use in generating reports, capturing data, and presenting data in an understandable format.
- Strong knowledge of product and manufacturing processes.
- Knowledge and understanding of the business unit and how decisions impact customer satisfaction, product quality, on-time delivery, and profitability of the unit.
- Knowledge of quality tools such as FMEA, PMP, SPC, 8D methodology, etc.
- Knowledge of Six Sigma and Lean/Kaizen.
- Ability to effectively communicate with a wide variety of internal and external customers.
- Physical Demands.
- Duties of this position are performed in a normal office environment.
- Duties may require repetitive manual movements (e.g., keyboarding), carrying, pushing or pulling light objects (under 5 kg) or heavy objects (over 5 kg), crouching, and climbing.
- Sustained visual concentration on small areas, such as monitors, screens, precise eye/hand coordination, sustained visual concentration on numbers, legal documents.
- Typical Experience.
- 4 to 6 years in a similar role or industry.
- Typical Education.
- Bachelor's degree in a related field, or consideration of an equivalent combination of education and experience.
- Educational requirements may vary by geography.
- Notes.
- This job description is not intended to be an exhaustive list of all duties and responsibilities of the position. Employees are held accountable for all duties of the job. Job duties and the % of time identified for any function are subject to change at any time.
- Celestica is an equal opportunity employer. All qualified applicants will receive consideration for employment and will not be discriminated against on any protected status (including race, religion, national origin, gender, sexual orientation, age, marital status, veteran or disability status or other characteristics protected by law).
- At Celestica we are committed to fostering an inclusive, accessible environment, where all employees and customers feel valued, respected and supported. Special arrangements can be made for candidates who need it throughout the hiring process. Please indicate your needs and we will work with you to meet them.
- Celestica (NYSE, TSX: CLS) enables the world's best brands. Through our recognized customer-centric approach, we partner with leading companies in Aerospace and Defense, Communications, Enterprise, HealthTech, Industrial, Capital Equipment and Energy to deliver solutions for their most complex challenges. As a leader in design, manufacturing, hardware platform and supply chain solutions, Celestica brings global expertise and insight at every stage of product development - from drawing board to full-scale production and after-market services for products from advanced medical devices, to highly engineered aviation systems, to next-generation hardware platform solutions for the Cloud. Headquartered in Toronto, with talented teams spanning 40+ locations in 13 countries across the Americas, Europe and Asia, we imagine, develop and deliver a better future with our customers.
- Celestica would like to thank all applicants, however, only qualified applicants will be contacted.
- Celestica does not accept unsolicited resumes from recruitment agencies or fee based recruitment services.
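Among the quality tools mentioned above, SPC (statistical process control) lends itself to a short worked example. The sketch below, with invented measurements, derives the classic ±3-sigma control limits from an in-control baseline sample and flags later readings that fall outside them:

```python
import statistics

# Control limits are derived from a baseline sample taken while the
# process was known to be in control; all measurement values are invented.
baseline = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.9, 10.1]
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl = mean + 3 * sigma  # upper control limit
lcl = mean - 3 * sigma  # lower control limit

# New measurements are then checked against those fixed limits.
new_readings = [10.1, 9.7, 10.6]
out_of_control = [x for x in new_readings if not lcl <= x <= ucl]
print(out_of_control)  # [10.6] - only this reading breaches the limits
```

Deriving the limits from a known-good baseline, rather than from the data being checked, is what lets the chart detect a process drifting out of control.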
Experience:
5 years required
Skills:
Research, Statistics, Finance, English
Job type:
Full-time
Salary:
negotiable
- Develop, maintain, and calibrate existing quantitative risk models, including provisioning models and credit scoring tailored to various portfolio types and financial institutions.
- Perform both conceptual and quantitative reviews of models, including validation, using programming scripts or automated tools.
- Provide business insights on post-model adjustments, such as management overlays.
- Research risk management topics and stay updated on recent industry developments.
- Prepare comprehensive model documentation, reports, or presentations to communicate methodologies and results to clients.
- Effectively convey observations, results, thoughts, and initiatives to client stakeholders in both Thai and English through proficient presentation during virtual and in-person meetings as needed.
- Propose innovative ideas to enhance team efficiency and effectiveness.
- Collaborate with colleagues and clients across multiple countries, primarily within Southeast Asia.
- Support partners and directors in preparing client proposals under tight deadlines.
- Mentor and onboard junior staff, ensuring the delivery of high-quality work.
- You will be expected to communicate closely with senior management and client personnel; assist in proposal development; mentor and develop junior team members; and maintain up-to-date knowledge of financial risk management methodologies, current corporate governance, and regulatory developments/requirements, both locally and internationally.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Senior Associates / Senior Consultants / Assistant Managers across our Firm are expected to:
- Actively seek out developmental opportunities for growth, act as strong brand ambassadors for the firm, and share their knowledge and experience with others.
- Respect the needs of their colleagues and build up cooperative relationships.
- Understand the goals of our internal and external stakeholders to set personal priorities as well as align their teams' work to achieve the objectives.
- Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
- Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
- Offer insights based on a solid understanding of what makes Deloitte successful.
- Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
- Understand disruptive trends and promote potential opportunities for improvement.
- You are someone with: A degree, preferably in technical engineering, statistics, economics, mathematics, finance, accountancy, or a related field.
- Possess a minimum of 5 years of relevant work experience. A background in banking or financial institutions is preferred, but this can be supplemented with significant knowledge of the financial markets and banking industry.
- Strong knowledge of risk management, with a focus on one of the risk domains (credit risk, market risk, operational risk, or climate risk) preferred.
- Ability to work independently and collaboratively with a diverse range of staff on qualitative and quantitative risk management in multitasking and cross-country settings.
- Proficient in data analytics or statistical analysis tools (e.g., Python and SAS), with advanced Excel skills.
- Experience in mentoring and coaching at least 2-3 junior team members.
- Proficient in business-level English, with the ability to communicate ideas and prepare professional client presentations.
- Due to volume of applications, we regret only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website. Requisition ID: 105622. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
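As a toy illustration of the provisioning models this role works with: under IFRS 9-style expected-credit-loss provisioning, the single-period expected loss per exposure is commonly computed as EAD × PD × LGD. All figures below are invented.

```python
# Toy single-period expected-credit-loss (ECL) calculation of the kind
# used in provisioning models. All exposures and parameters are invented.
portfolio = [
    # (exposure at default, probability of default, loss given default)
    (1_000_000, 0.02, 0.45),
    (500_000, 0.10, 0.60),
    (250_000, 0.01, 0.40),
]

ecl = sum(ead * pd * lgd for ead, pd, lgd in portfolio)
print(f"Portfolio ECL: {ecl:,.0f}")  # Portfolio ECL: 40,000
```

Real provisioning models add staging, forward-looking scenarios, discounting, and management overlays on top of this core product, which is where the validation and post-model-adjustment work described above comes in.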
Experience:
5 years required
Skills:
Budgeting, Finance, Compliance
Job type:
Full-time
Salary:
negotiable
- Own the whole management reporting process, including planning, budgeting, forecasting, and variance analysis, with a focus on developing process efficiency together with non-finance stakeholders (Operations, SCM, CFT, HR, etc.), including:
- IFRS 15 reporting.
- US GAAP compliance.
- CF review & hedging.
- CPR reporting (productivity & continuous improvement financial measurement).
- Flawless Launch (project phase-gate financial review).
- New quotations and rates.
- Management Dashboards and GM partnering.
- Lead complex cross-functional projects in the FP&A and tax areas as the Finance department representative.
- Act as a project manager for finance-related projects, including tax compliance initiatives or tax model implementation (TP area).
- Assist in scenario analysis, assessing the financial impact of various business initiatives.
- Internal Controls & SOX.
- Manage and improve the company's forecasting tools and processes to ensure accurate and timely information.
- Simplify and automate FP&A processes to improve efficiency and accuracy to reduce manual efforts and increase process reliability.
- Document FP&A processes and desktop procedures so that knowledge sharing and clear guidelines are in place for all team members. Further develop and optimize this documentation, ensuring the application of best practices in compliance with the company's internal control framework.
- Map team competencies and update the RR matrix.
- Mentor and develop junior FP&A team members.
- Bachelor's degree in Finance, Accounting, Economics, or a related field.
- 5+ years of experience in FP&A, Finance, Accounting or related financial roles.
- Strong analytical skills with a demonstrated ability to interpret data and provide actionable insights.
- Expertise in financial modeling, budgeting, forecasting, and variance analysis.
- Advanced proficiency in Excel and experience with financial software (e.g., Oracle Hyperion Financial Management, Longview).
- Excellent communication and presentation skills, with the ability to interact effectively with senior leadership.
- Experience in process improvement and automation within finance functions, leveraging technology such as RPA (Robotic Process Automation) or financial software.
- Familiarity with tax-related finance projects such as transfer pricing, indirect taxes, or compliance.
- Project management skills and ability to work in a cross-functional team environment and manage multiple stakeholders.
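Variance analysis of the kind owned by this role reduces, at its core, to budget-versus-actual arithmetic. A minimal sketch with invented figures (costs stored as negatives):

```python
# Budget vs. actual variance; all figures are invented for illustration.
budget = {"revenue": 1200, "cogs": -700, "opex": -300}
actual = {"revenue": 1150, "cogs": -720, "opex": -280}

# Variance = actual - budget: positive is favorable for revenue, and for
# costs (stored as negatives) a positive variance means spending less
# than planned.
variance = {k: actual[k] - budget[k] for k in budget}
variance_pct = {k: variance[k] / abs(budget[k]) for k in budget}

for k in budget:
    print(f"{k}: {variance[k]:+} ({variance_pct[k]:+.1%})")
```

The FP&A work described above is largely about automating this computation reliably across entities and periods, and then explaining the drivers behind each variance line.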
Experience:
8 years required
Skills:
Data Analysis, Tableau, Excel
Job type:
Full-time
Salary:
negotiable
- Work closely with stakeholders to understand business requirements, identify opportunities, and whitespace to commercialize data-driven insights and solutions.
- The candidate will need to lead data analysis and sales pitching for the Annual Media planning process, and have good commercial awareness and communication skills to be able to connect with multiple stakeholders.
- Break down business questions into analytical frameworks; being able to talk the language of technical teams as well as commercial stakeholders is a key requirement ...
- The candidate will need to lead the development of end-to-end data-led media solutions and media measurements to keep ahead of media industry standards, and own the roadmap to deploy and modernize media measurements across different media platforms, channels, and mechanics.
- Candidates should have some understanding of offline and online SSP and DSP platforms and architecture, and of market direction, in order to strategically build and improve media solutions, either owned or in partnership with external parties.
- Be proactive and co-own the go-to-market strategy, along with commercial stakeholders, for multiple media channels and other data commercialization initiatives; proactively help plan the right focus areas for the team, and develop solutions and products that build the roadmap and pipeline for commercial opportunities.
- Develop and implement predictive models, statistical algorithms, and machine learning models to support business needs.
- Develop and implement data visualizations using Power BI, Data Studio, QuickSight, Tableau, or Excel to effectively communicate insights to stakeholders.
- Collaborate with cross-functional teams, including business analysts, product managers, and developers, to implement data-driven solutions.
- Stay up to date with emerging technologies, marketing technology platforms, omni-channel media and industry trends to identify new opportunities for improving data analytics and applications.
- Mentor and coach junior data scientists and data analysts to develop their skills and expertise.
- Bachelor's or Master's degree in data science, engineering, statistics, economics, computer science, mathematics, or a related field is a requirement.
- MBA/Business degree with strong background in technical understanding and hands-on expertise is preferable.
- Online media experience will be a strong advantage.
- 8+ years of experience as a data scientist or data analyst in marketing across any industry is a requirement.
- Strong proficiency in Python/ Pyspark/ SQL/ R is a requirement.
- Strong experience in storytelling from data, analysis, and insight is required.
- Commercial understanding, and having a balanced approach for go-to-market strategy is a requirement.
- Experience with machine learning algorithms and statistical modeling is a bonus.
- Strong communication skills and the ability to collaborate with cross-functional teams is a requirement.
- Proven ability to work independently and manage multiple projects simultaneously is a requirement.
- The candidate must display a high sense of accountability, be agile in handling high-value projects, and be able to motivate the team to deliver on shared objectives with the commercial plan.
- Experience in mentoring and coaching junior/ senior DA/ DS is a requirement.
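As a minimal stand-in for the predictive modelling and statistical work mentioned above, a one-feature ordinary-least-squares fit can be computed directly from its closed-form formulas. The (x, y) pairs below are invented illustrative data, labelled loosely as media spend versus attributed sales:

```python
# One-feature ordinary least squares via the closed-form slope/intercept
# formulas; the data points are invented for illustration.
xs = [1, 2, 3, 4, 5]             # e.g. weekly media spend (units)
ys = [2.1, 4.0, 6.2, 7.9, 10.1]  # e.g. attributed sales (units)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

print(f"y = {slope:.2f}x + {intercept:.2f}")  # y = 1.99x + 0.09
```

In practice this kind of fit would be done with a library and many features, but the closed-form version makes the underlying statistics of a marketing-response model concrete.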