Skills:
SQL, Oracle, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- At least 7 years of experience as a Data Engineer or in a related role.
- Hands-on experience with SQL, database management (e.g., Oracle, SQL Server, PostgreSQL), and data warehousing concepts.
- Experience with ETL/ELT tools such as Talend, Apache NiFi, or similar.
- Proficiency in programming languages like Python, Java, or Scala for data manipulation and automation.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Strong understanding of data governance, security, and privacy frameworks in a financial services context.
- Excellent problem-solving skills and attention to detail.
- Experience working with Data Visualization or BI tools like Power BI, Tableau.
- Familiarity with machine learning concepts, model deployment, and AI applications.
- Banking or financial services industry experience, especially in retail or wholesale banking data solutions.
- Certification in cloud platforms (e.g., AWS Certified Data Engineer, Microsoft Azure Data Engineer, Google Professional Data Engineer).
- Contact:.
- You can read and review the Privacy Policy of Krungthai Bank PCL at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood group, which may appear on a copy of a national ID card. Please therefore refrain from uploading any documents, including copies of the national ID card, or redact sensitive personal data or any other data that is not relevant or necessary for the purpose of the job application before uploading to the website. Please also make sure that any sensitive personal data (if any) has been deleted from your resume and other documents before they are uploaded. The Bank does need to collect personal data about your criminal record in order to consider you for employment, or to verify qualifications, prohibited characteristics, or suitability for the position; consent to the collection, use, or disclosure of criminal record data is therefore necessary for entering into the employment contract and for being considered under the above purposes. If you do not consent to the collection, use, or disclosure of personal data about your criminal record, or later withdraw that consent, the Bank may be unable to proceed with the above purposes, and you may lose the opportunity to be considered for employment with the Bank.
Experience:
3 years required
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience with data modelling techniques and integration patterns.
- Write data transformation jobs in code (a brief illustrative sketch follows this posting).
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge of the data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solutions.
- Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience working with data warehouse and ELT solutions, tools, and techniques (e.g., Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI, grounded in data quality and management, with opportunities to work on innovative projects and develop new skills in these fields.
- Contact: [email protected] (K.Thipwimon).
- You can read and review the Privacy Policy of Krungthai Bank PCL at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood group, which may appear on a copy of a national ID card. Please therefore refrain from uploading any documents, including copies of the national ID card, or redact sensitive personal data or any other data that is not relevant or necessary for the purpose of the job application before uploading to the website. Please also make sure that any sensitive personal data (if any) has been deleted from your resume and other documents before they are uploaded. The Bank does need to collect personal data about your criminal record in order to consider you for employment, or to verify qualifications, prohibited characteristics, or suitability for the position; consent to the collection, use, or disclosure of criminal record data is therefore necessary for entering into the employment contract and for being considered under the above purposes. If you do not consent to the collection, use, or disclosure of personal data about your criminal record, or later withdraw that consent, the Bank may be unable to proceed with the above purposes, and you may lose the opportunity to be considered for employment with the Bank.
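The posting above asks for data transformation jobs written in code and for data quality tests. As a rough, illustrative sketch only (not part of the posting), the following pandas job cleans a hypothetical orders extract and fails fast on basic quality checks; the file name, column names, and rules are assumptions.

```python
import pandas as pd

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Clean raw orders and add a derived revenue column."""
    out = df.copy()
    out["order_date"] = pd.to_datetime(out["order_date"], errors="coerce")
    out = out.dropna(subset=["order_id", "order_date"])   # drop rows missing key fields
    out["revenue"] = out["quantity"] * out["unit_price"]  # derived measure
    return out

def quality_checks(df: pd.DataFrame) -> None:
    """Fail fast if basic data quality expectations are violated."""
    assert df["order_id"].is_unique, "duplicate order_id values found"
    assert (df["revenue"] >= 0).all(), "negative revenue detected"

if __name__ == "__main__":
    raw = pd.read_csv("raw_orders.csv")        # hypothetical source extract
    clean = transform(raw)
    quality_checks(clean)
    clean.to_parquet("clean_orders.parquet")   # hand-off to the warehouse/ELT layer
```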
Experience:
3 years required
Skills:
Microsoft Azure, SQL, UNIX, Python, Hadoop
Job type:
Full-time
Salary:
negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI and Tableau; work across workstreams to support data requirements, including reports and dashboards; and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns following Data Quality ... (a brief profiling sketch follows this posting).
- 3+ years of experience in big data technology, data engineering, or data analytics application system development.
- Experience with unstructured data for business intelligence or computer science would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop programming.
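The profiling bullet above can be illustrated with a small PySpark sketch; this is only an assumed example (the table path and use of Parquet are hypothetical), not the team's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()

# Hypothetical raw dataset; on Azure Databricks this could be a Delta table or an ADLS path.
df = spark.read.parquet("/mnt/raw/customers")

# Null counts per column: a first pass at completeness.
nulls = df.select([F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns])
nulls.show(truncate=False)

# Approximate distinct counts give a quick sense of each column's cardinality.
cardinality = df.select([F.approx_count_distinct(c).alias(c) for c in df.columns])
cardinality.show(truncate=False)
```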
Experience:
No experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day to day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- Perform first-tier investigation of any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with the GSA Acceptable Use Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Database Administration
Job type:
Full-time
Salary:
negotiable
- Carry out work under the data governance framework, policies, and processes defined by the organization.
- Work with data owners and data stewards to define data standards, data management practices, and guidelines for data usage.
- Ensure compliance with relevant laws and standards (e.g., GDPR, PDPA).
- Data Quality Management: define and monitor data quality metrics (e.g., accuracy, completeness, timeliness); a brief illustrative sketch follows this posting.
- Use data quality management tools and processes to correct errors and improve the reliability of data.
- Coordinate with business units to resolve data quality issues and prevent them in the future.
- Metadata and Master Data Management: develop and maintain the organization's metadata repository and data dictionary.
- Oversee master data management so that data remains consistent across systems and processes.
- Work with stakeholders as the focal point between business units, IT teams, and the governance team to build a culture of data accountability.
- Lead meetings of the data governance committee and working groups.
- Provide training and support to stakeholders to raise data literacy and ensure they follow governance guidelines.
- Risk management and regulatory compliance: assess data-related risks and propose remediation approaches.
- Ensure that data usage is aligned with the organization's objectives, laws, and ethical standards.
- Lead data governance audits and assessments.
- Experience in data governance and data quality management.
- Bachelor's degree in Business Administration, Computer Science, Business Computer, Information Technology, or another related field.
- Contact.
- Human Resources Office
- Thai Beverage Public Company Limited
- Lao Peng Nguan Tower 1, 333 Vibhavadi Rangsit Road, Chom Phon, Chatuchak, Bangkok 10900.
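To make the data quality metrics mentioned above concrete, here is a minimal, hedged sketch of completeness and timeliness measures in Python; the file, column names, and one-day freshness window are assumptions for illustration only.

```python
import pandas as pd

def completeness(df: pd.DataFrame) -> pd.Series:
    """Share of non-null values per column (1.0 means fully populated)."""
    return df.notna().mean()

def timeliness(df: pd.DataFrame, ts_col: str, max_age_days: int = 1) -> float:
    """Share of records refreshed within the allowed age window."""
    age = pd.Timestamp.now() - pd.to_datetime(df[ts_col])
    return float((age <= pd.Timedelta(days=max_age_days)).mean())

if __name__ == "__main__":
    master = pd.read_csv("customer_master.csv")   # hypothetical master-data extract
    print(completeness(master))
    print("timeliness:", timeliness(master, "last_updated"))
```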
Skills:
Industry trends, Statistics, Python
Job type:
Full-time
Salary:
negotiable
- Develop and execute a forward-thinking analytics strategy tailored to the retail industry, focusing on leveraging data platforms to drive revenue growth, operational efficiency, and customer satisfaction.
- Lead, mentor, and inspire a team of data scientists and analysts, fostering a culture of innovation, collaboration, and data-driven decision-making.
- Stay ahead of industry trends, emerging technologies, and best practices in data science and retail analytics to maintain CP Axtra's competitive edge.
- Analytics Execution.
- Oversee the integration of diverse data sources, including POS systems, CRM platforms, online transactions, and third-party providers, into our cloud-based data platform.
- Design and develop advanced machine learning models, algorithms, and statistical analyses to uncover actionable insights related to customer behavior, product performance, and market trends.
- Apply expertise in recommendation and personalization algorithms to enhance customer experiences and engagement.
- Deliver data-driven solutions to optimize pricing strategies, inventory management, and promotional campaigns, leveraging state-of-the-art analytics tools and methodologies.
- Business Partnership.
- Partner closely with retail operations, marketing, and sales teams to understand business challenges and provide tailored analytical support that aligns with strategic objectives.
- Identify opportunities to enhance customer segmentation, personalized marketing efforts, and customer retention strategies through advanced data science techniques.
- Act as a key advisor to senior leadership, translating complex data insights into actionable recommendations and business value.
- Performance Monitoring and Optimization.
- Define and monitor key performance indicators (KPIs) related to retail operations, such as sales conversion rates, customer lifetime value, and basket analysis.
- Leverage analytics to continuously assess and optimize business processes, driving operational efficiency and profitability.
- Communication and Presentation.
- Present complex analytical findings, models, and recommendations to stakeholders in a clear, impactful, and visually compelling manner.
- Collaborate across departments to implement data-driven initiatives that align with CP Axtra's goals and drive tangible outcomes.
- Education and Experience.
- Bachelor's degree in Statistics, Mathematics, Computer Science, Data Science, Economics, or a related field (Master's or PhD strongly preferred).
- Extensive experience in analytics, data science, or business intelligence roles, with significant exposure to the retail industry.
- Technical Skills.
- Advanced proficiency in Python, R, SQL, and machine learning frameworks.
- Expertise in data visualization tools (e.g., Tableau, Power BI) and cloud-based data platforms (e.g., AWS, GCP, Azure).
- In-depth knowledge of big data technologies (e.g., Spark, Hadoop) and modern data engineering practices.
- Strong understanding of recommendation/personalization algorithms and data processing technologies.
- Leadership and Business Acumen.
- Proven ability to lead high-performing teams in a dynamic, fast-paced environment.
- Exceptional strategic thinking and problem-solving skills with a demonstrated focus on delivering business value.
- Deep understanding of retail operations, including inventory management, customer journey mapping, and merchandising strategies.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy. .
Skills:
Finance, Financial Analysis
Job type:
Full-time
Salary:
negotiable
- This vacancy is to support new business expansion.
- Act as an active Finance Business Partner (FP&A) in developing property investment strategy and execution: mixed-use projects.
- Engage with senior management to understand the wider market trend and external factors which affect the investment.
- Lead and present financial feasibility and valuation of medium to large scale property projects to maximize return on investment.
- Be able to challenge key stakeholders on associated capex and opex investment in detail.
- Perform post investment appraisal and provide insights and recommendation for improvement.
- Own the business planning cycle (budget, forecast, long term plan), understand key business drivers, risk and opportunities.
- Lead the continuous improvement of financial processes and reporting, and be able to leverage relevant technology and tools at work.
- Coach team and drive team effectiveness.
- Bachelor's degree or higher in business administration, finance, engineering, real estate.
- At least 5 years of financial evaluation experience in mid- to large-scale property development.
- 7+ years of finance experience in a real estate company / mixed-use project.
- Experience working with senior business stakeholders.
- Feasibility study and financial analysis skills.
- Real Estate Business acumen.
- Stakeholder management and Influencing skills.
- Strategic thinking and financial analysis skills.
- Good communication and presentation skills.
- Effective team management.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Power BI, Excel, Problem Solving
Job type:
Full-time
Salary:
negotiable
- Following critical path, ensuring all activities meet the required deadlines.
- Transforming data into business insights.
- Lead analytical tasks by utilizing data analytics and Power BI skills.
- Coordinate cross-functional teams (Commercial/Store Operations) by persuading with data and reporting.
- Support and conduct meeting with Commercial senior leadership team to accomplish project and related task.
- Other assignments as it deems appropriate.
- Bachelor's Degree or above in IT, IT Engineering, Logistics, Business Data, Marketing, Business Administration, or a related field.
- Experience of retail or supplier supply chain, or distribution operations.
- Background in drawing planograms is a big plus.
- Good Computer skills, especially on MS Excel.
- Product knowledge (preferable).
- Cross-functional agility, and the ability to lead and meet objectives in a fast-paced, rapidly changing environment.
- Strong logical thinking, visual design, and presentation skills with exceptional attention to detail.
- Good analytical & problem solving skills, planning skills, numerical skills.
- Good attitude and self-motivated.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy. .
Experience:
5 years required
Skills:
Industry trends, Data Analysis, Project Management
Job type:
Full-time
Salary:
negotiable
- Drive and execute initiatives to improve operational efficiency, drive cost savings, and enhance retail productivity through innovation, process improvement, and streamlined operations supporting company s and top management s directions.
- Monitor and evaluate store performance, financial data, and in-store execution, providing insights and recommendations to the team for continuous development of store operations and process enhancements.
- Leverage global best practices from across the world to implement innovative solutio ...
- Apply business insights, industry trends, and data analysis to guide the team in translating information into actionable initiatives that drive process standardization and improved performance.
- Coordinate and collaborate with cross-functional teams to ensure alignment on project goals and strategies, helping to resolve issues and ensure smooth execution of initiatives.
- Oversee project management efforts, tracking progress and ensuring timely completion of tasks. Provide regular status updates, identify potential risks, and support the team in implementing mitigation plans when necessary.
- Prepare reports and presentations for senior management to communicate project progress, performance metrics, and key developments, offering insights and recommendations for effective decision-making.
- Lead and develop team members, fostering an entrepreneurial mindset and empowering them to identify opportunities for continuous improvement, cost savings, and operational excellence across the organization.
- Degree in Business, Economics, Engineering, or related field.
- 5+ years of working experience in process improvement, project management, quantitative analysis, or cost savings.
- Experience as a consultant for internal / external clients, or experience in Retail sector is a plus.
- Six Sigma Green Belt certification is a plus.
- Ability to analyze financial, operational, and performance data to generate actionable insights and recommendations.
- Familiarity with data analysis tools (e.g., Power BI, Excel) for analyzing data and generating performance reports.
- Skill in managing and facilitating process changes, ensuring that improvements are implemented effectively and embraced by the team.
- Proficiency in preparing reports and presentations that summarize data, analysis, and project updates for management.
- Strong ability to communicate clearly and persuasively with senior management, team members, and cross-functional partners.
- Ability to mentor and coach team members, developing a strong, capable team with a focus on business optimization.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Project Management, SQL, Python, English
Job type:
Full-time
Salary:
negotiable
- Fine-tune language models that have already been trained for generative AI applications; ensure that the LLMs and LLM-based pipelines are tuned and released (a brief fine-tuning sketch follows this posting).
- Create and implement LLMs for various content-creation jobs; develop and communicate roadmaps for data science projects.
- Design effective agile workflows and manage a cycle of deliverables that meet timeline and resource constraints.
- Serve as a bridge between stakeholders and AI suppliers to facilitate seamless communication and understanding of project requirements.
- Work closely with external AI suppliers to ensure alignment between project goals and technological capabilities.
- Identify and gather data sets necessary for AI projects.
- Prior experience in Machine Learning, Deep Learning, and AI algorithms to solve respective business cases and pain points.
- Prior hands-on experience in data-mining techniques to better understand each pain point and provide insights.
- Able to design and conduct analysis to support product & channel improvement and development.
- Present key findings and recommendations to business counterparts and senior management on project approach and strategic planning.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- Native Thai speaker & fluent in English.
- 3+ years of proven experience as a Data Scientist with a focus on project management (Retail or E-Commerce business is preferable).
- At least 2 years of relevant experience as an LLM Data Scientist; experience in SQL and Python (Pandas, NumPy, SparkSQL).
- Ability to manipulate and analyze complex, high-volume, high-dimensionality data from varying sources.
- Experience in Big Data technologies like Hadoop, Apache Spark, and Databricks.
- Experience in machine learning and deep learning (TensorFlow, Keras, scikit-learn).
- Good Knowledge of Statistics.
- Experience in Data Visualization (Tableau, PowerBI) is a plus.
- Excellent communication skills with the ability to convey complex findings to non-technical stakeholders.
- A good attitude toward teamwork and a willingness to work hard.
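As a hedged illustration of the fine-tuning bullet above (not the team's actual stack), the sketch below runs a few training steps of a small pre-trained causal language model using Hugging Face Transformers and PyTorch; the model choice (gpt2), toy corpus, and hyperparameters are assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Tiny illustrative corpus; a real project would use a curated, task-specific dataset.
texts = [
    "Customer review: great delivery experience. Summary: positive.",
    "Customer review: the parcel arrived damaged. Summary: negative.",
]

tok = AutoTokenizer.from_pretrained("gpt2")          # small stand-in for a production LLM
tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

enc = tok(texts, return_tensors="pt", padding=True, truncation=True, max_length=64)
labels = enc["input_ids"].masked_fill(enc["attention_mask"] == 0, -100)  # ignore padding in the loss

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for _ in range(3):                                   # a few demo steps, not a real training run
    out = model(input_ids=enc["input_ids"],
                attention_mask=enc["attention_mask"],
                labels=labels)
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.eval()
prompt = tok("Customer review: fast and friendly service. Summary:", return_tensors="pt")
generated = model.generate(**prompt, max_new_tokens=10, pad_token_id=tok.eos_token_id)
print(tok.decode(generated[0], skip_special_tokens=True))
```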
Experience:
3 years required
Skills:
English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for air conditioning & fire protection systems.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- Provide 2nd-level support to Data Center Operations (FOC), on site, for incident and problem management.
- Provide 2nd-level support to the engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in an annual report.
- Responsible for implementation of mechanical systems such as comfort air and precision air.
- Responsible for implementation of fire suppression systems such as FM200, NOVEC, CO2, fire sprinklers, fire pumps, fire alarms, and VESDA.
- Working hours are 9:00 - 18:00 (office hours), with the ability to stand by on call or work on site on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years of experience in maintaining air conditioning systems such as comfort air, precision air, chillers (air-cooled and water-cooled), and pump motors: implementing and supporting mechanical air-conditioning systems in buildings or Data Centers.
- At least 1 year of experience in designing air conditioning systems such as comfort air, precision air, chillers (air-cooled and water-cooled), and pump motors: implementing and supporting mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as part of a team and to stand by on call on holidays.
- Able to work overtime if required and to respond to hotline calls (less than 1 hour on site from your home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Experience:
5 years required
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What You'll need to Succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 5+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert level understanding of JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Experience:
2 years required
Skills:
Risk Management, Microsoft Office, Data Analysis
Job type:
Full-time
Salary:
negotiable
- Transaction Monitoring: Analyze transactions in real time using fraud detection tools and rules (a brief rule-based sketch follows this posting).
- Identify suspicious activity based on pre-defined risk profiles and behavioral patterns.
- Investigate flagged transactions and determine their legitimacy.
- Escalate high-risk cases to the Fraud Management team for further investigation.
- Fraud Investigation: Gather and analyze evidence related to suspected fraudulent activity.
- Conduct research to identify potential fraud schemes and perpetrators.
- Document findings and recommend appropriate actions, such as blocking accounts, recovering funds, or reporting to law enforcement.
- Collaborate with internal teams (customer support, risk management) to resolve cases effectively and efficiently.
- Data Analysis & Reporting: Analyze fraud trends and patterns to identify emerging threats and adjust detection rules accordingly.
- Generate reports on fraud activity, providing insights to the Fraud Management team and senior management.
- Track and measure the effectiveness of fraud prevention and detection measures.
- Stay Informed: Stay up-to-date on the latest fraud threats, trends, and best practices.
- Participate in ongoing training and development opportunities to enhance your skills and knowledge.
- Minimum of 2-3 years of experience in fraud analysis or a related field.
- Strong analytical and problem-solving skills.
- Excellent attention to detail and ability to identify anomalies in data.
- Proficient in Microsoft Office Suite, SQL, and data analysis tools.
- Understanding of fraud detection and prevention techniques preferred.
- Effective communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Bachelor's degree in business administration, finance, IT, engineering, or a related field preferred.
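A minimal sketch of the rule-based transaction monitoring described above, assuming a pandas DataFrame of transactions; the thresholds, column names, and rules are illustrative assumptions, not the actual risk profiles.

```python
import pandas as pd

AMOUNT_LIMIT = 50_000    # hypothetical single-transaction threshold
VELOCITY_LIMIT = 5       # hypothetical max transactions per account per hour

def flag_transactions(tx: pd.DataFrame) -> pd.DataFrame:
    """Apply two simple pre-defined rules and return the flagged rows."""
    tx = tx.copy()
    tx["timestamp"] = pd.to_datetime(tx["timestamp"])

    # Rule 1: unusually large single transaction.
    tx["large_amount"] = tx["amount"] > AMOUNT_LIMIT

    # Rule 2: too many transactions from one account within the same hour.
    hour = tx["timestamp"].dt.floor("h")
    tx["txn_per_hour"] = tx.groupby(["account_id", hour])["amount"].transform("count")
    tx["high_velocity"] = tx["txn_per_hour"] > VELOCITY_LIMIT

    return tx[tx["large_amount"] | tx["high_velocity"]]
```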
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability (a brief ingestion sketch follows this posting).
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Informatica IICS, Databricks/Spark, and Python hands-on development skills are a must.
- Clear hands-on experience with database systems - SQL server, Oracle, Azure Synapse, Snowflake and Cosmos, Cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong Knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge of and working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
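The ingestion bullet above can be illustrated with a small watermark-based incremental load in Python; SQLite stands in for the real source and target systems, and the table, columns, and watermark value are assumptions, not the actual Informatica/ADF implementation.

```python
import sqlite3
import pandas as pd

WATERMARK = pd.Timestamp("2024-01-01")   # hypothetical timestamp of the last successful load

def extract_changed_orders(conn) -> pd.DataFrame:
    """Pull only rows changed since the watermark (incremental-load pattern)."""
    df = pd.read_sql("SELECT order_id, customer_id, amount, updated_at FROM orders",
                     conn, parse_dates=["updated_at"])
    return df[df["updated_at"] > WATERMARK]

def load(delta: pd.DataFrame, conn) -> None:
    """Append the delta into a staging table for a downstream merge step."""
    delta.to_sql("stg_orders", conn, if_exists="append", index=False)

if __name__ == "__main__":
    source = sqlite3.connect("source_system.db")   # stand-in for Oracle / SQL Server / etc.
    target = sqlite3.connect("warehouse.db")       # stand-in for Synapse / Snowflake / etc.
    delta = extract_changed_orders(source)
    load(delta, target)
    print(f"loaded {len(delta)} changed rows")
```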
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark (a brief PySpark sketch follows this posting).
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
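A minimal PySpark sketch of the kind of pipeline stage described above, assuming a hypothetical CSV landing zone and Parquet curated zone; the paths, columns, and partitioning are illustrative only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-pipeline-sketch").getOrCreate()

# Ingest raw events from a hypothetical landing zone.
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3a://raw-bucket/events/"))

# Standardise types, drop incomplete records, and derive a partition column.
clean = (raw
         .withColumn("event_time", F.to_timestamp("event_time"))
         .filter(F.col("user_id").isNotNull())
         .withColumn("event_date", F.to_date("event_time")))

# Persist to the curated zone, partitioned for downstream consumers.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://curated-bucket/events/"))
```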
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouse.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in one of these RDBMSs such as Oracle.
- Develops ETL technical specifications, designs, develops, tests, implements, and supports optimal data solutions.
- Develops and documents ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and create code for all related data extraction, transformation, and loading (ETL) into the databases under responsibility.
- Creates, executes, and documents unit test plans for ETL and data integration processes and programs (a brief unit-test sketch follows this posting).
- Perform problem assessment, resolution and documentation in existing ETL packages, mapping and workflows in production.
- Perform performance tuning of ETL processes and SQL queries, and recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IAC tools like Terraform or Azure ARM.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
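As a hedged illustration of the unit-test bullet above, the sketch below tests a small, assumed transformation function with pytest-style assertions; the function and data are hypothetical, not the team's actual ETL packages.

```python
import pandas as pd

def standardise_customers(df: pd.DataFrame) -> pd.DataFrame:
    """Assumed transformation under test: tidy names and deduplicate on customer_id."""
    out = df.copy()
    out["name"] = out["name"].str.strip().str.title()
    return out.drop_duplicates(subset=["customer_id"])

def test_standardise_customers():
    raw = pd.DataFrame({
        "customer_id": [1, 1, 2],
        "name": ["  alice smith ", "  alice smith ", "BOB JONES"],
    })
    result = standardise_customers(raw)
    assert len(result) == 2                                   # duplicate row removed
    assert result["name"].tolist() == ["Alice Smith", "Bob Jones"]
```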
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit, income estimation, and fraud models (a brief model-training sketch follows this posting).
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data sets (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support collection strategy.
- Bachelor's Degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience in programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience in building and implementing AI/ML solutions for a senior role.
- Experience with Python libraries - NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, and VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
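A minimal sketch of the model-building bullet above: a scikit-learn logistic regression trained on synthetic features standing in for engineered telco/retail/loan signals; the data, features, and metric are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for engineered features and a default/fraud-style label.
n = 5_000
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]
print("Test AUC:", round(roc_auc_score(y_test, proba), 3))
```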
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated where possible) to support business data analysis (a brief retrieval-and-preparation sketch follows this posting).
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch / real-time / near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.
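The retrieval-and-preparation bullet above can be illustrated with a small Python sketch; SQLite stands in for whatever relational source is actually used, and the query, columns, and output are assumptions.

```python
import sqlite3
import pandas as pd

# Stand-in connection; in practice this could be any relational source exposed to analysts.
conn = sqlite3.connect("analytics.db")

orders = pd.read_sql(
    "SELECT customer_id, order_date, amount FROM orders WHERE order_date >= '2024-01-01'",
    conn,
    parse_dates=["order_date"],
)

# Prepare a monthly revenue summary ready for BI consumption.
summary = (orders
           .assign(month=orders["order_date"].dt.to_period("M").astype(str))
           .groupby("month", as_index=False)["amount"].sum()
           .rename(columns={"amount": "monthly_revenue"}))

summary.to_csv("monthly_revenue.csv", index=False)
print(summary.head())
```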
Experience:
5 years required
Skills:
Contracts, Compliance, Project Management, English
Job type:
Full-time
Salary:
negotiable
- Oversee and coordinate the maintenance and operation of critical infrastructure systems, including but not limited to HV and LV distribution systems, associated plant/equipment, HVAC mechanical cooling/heating systems, fire protection and suppression systems, and electrical and mechanical systems within the portfolio of buildings.
- Manage maintenance contracts and monitor contractor performance to ensure compliance with service level agreements.
- Provide technical support to the maintenance and operations teams, ensuring that all ...
- Monitor and optimize the performance of critical infrastructure systems, ensuring that they operate at peak efficiency and reliability.
- Develop and implement maintenance programs and procedures to ensure that critical infrastructure systems are maintained in accordance with best practice standards.
- Prepare and maintain accurate records of all maintenance and engineering work, including maintenance schedules, work orders, and engineering drawings.
- Develop and maintain relationships with key stakeholders, including Facilities Managers, the Client's staff and representatives, contractors, and suppliers.
- Participate in emergency call-out roster providing cover for weekend and team member absences, as required.
- Volunteer ideas/initiatives that contribute to the service levels and delivery.
- Undertake other tasks, as required by the Client, in accordance with experience and competencies.
- Bachelor's degree in Mechanical/Electrical Engineering or related field.
- Minimum of 5 years of experience in critical environment or data centre operations and maintenance.
- Experience in managing maintenance contracts and monitoring contractor performance.
- Strong technical knowledge of critical infrastructure systems, including but not limited to HV and LV distribution systems, associated plant/equipment, HVAC mechanical cooling/heating systems, fire protection and suppression systems, and electrical and mechanical systems.
- Excellent problem-solving skills, with the ability to identify, diagnose and solve technical issues.
- Excellent English & Thai communication skills, with the ability to communicate technical information to non-technical stakeholders.
- Strong project management skills, with the ability to manage multiple projects simultaneously.
- Knowledge of safety and environmental regulations and standards.
- Ability to work under pressure and in a fast-paced environment.
Experience:
5 years required
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- Provide 2nd-level support to Data Center Operations (FOC), on site, for incident and problem management.
- Provide 2nd-level support to the engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in an annual report.
- Responsible for implementing electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS.
- Bachelor's degree in Engineering, Electrical Engineering, or a related field.
- More than 5 years of experience in the maintenance of electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS: implementing and supporting electrical systems in buildings or Data Centers.
- At least 1 year of experience in designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS): implementing and supporting electrical systems in buildings.
- Able to use AutoCAD and Visio.
- Able to work as part of a team and to stand by on call on holidays.
- Able to work overtime if required and to respond to hotline calls (less than 1 hour on site from your home).
- Proficiency in English communication, both reading and writing, is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).