Skills:
SQL, Oracle, Data Warehousing
Job type:
Full-time
Salary:
Negotiable
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- At least 7 years of experience as a Data Engineer or in a related role.
- Hands-on experience with SQL, database management (e.g., Oracle, SQL Server, PostgreSQL), and data warehousing concepts.
- Experience with ETL/ELT tools such as Talend, Apache NiFi, or similar.
- Proficiency in programming languages like Python, Java, or Scala for data manipulation and automation.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Strong understanding of data governance, security, and privacy frameworks in a financial services context.
- Excellent problem-solving skills and attention to detail.
- Experience working with Data Visualization or BI tools like Power BI, Tableau.
- Familiarity with machine learning concepts, model deployment, and AI applications.
- Banking or financial services industry experience, especially in retail or wholesale banking data solutions.
- Certification in cloud platforms (e.g., AWS Certified Data Engineer, Microsoft Azure Data Engineer, Google Professional Data Engineer).
- Contact:
- You can read and review the privacy policy of Krungthai Bank PCL at https://krungthai.com/th/content/privacy-policy. The Bank has no intention of, and no need for, processing sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Please therefore do not upload any documents, including copies of your ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the job application on the website. In addition, please make sure you have removed any sensitive personal data (if any) from your resume and other documents before uploading them. The Bank does need to collect personal data about your criminal record in order to consider your application, verify your qualifications and disqualifications, or assess your suitability for the position; consent to the collection, use, or disclosure of criminal record data is necessary for entering into a contract and being considered for the above purposes. If you do not give this consent, or later withdraw it, the Bank may be unable to proceed with the above purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
Negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development and programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Define data retention policies, monitor performance, and advise on any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write and maintain technical specifications, design documents, and process flows.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for the technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration; hands-on development skills in Informatica IICS, Databricks/Spark, and Python are a must.
- Clear hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, Cosmos), cloud technologies (e.g., AWS, Azure, Google Cloud), and NoSQL databases (e.g., Cosmos DB, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong skills in analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge of and working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
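The ingestion duties described above can be sketched in miniature. The following is a hypothetical, dependency-free Python example of the parse-validate-load shape such jobs take; the table name, columns, and CSV layout are invented for illustration, not taken from the posting's actual stack:

```python
import csv
import io
import sqlite3

def load_orders(csv_text: str, conn: sqlite3.Connection) -> int:
    """Toy ingestion job: parse raw CSV, skip malformed records
    (resilience), and load the valid rows into an 'orders' table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        try:
            rows.append((int(rec["id"]), float(rec["amount"])))
        except (KeyError, ValueError):
            continue  # skip bad records instead of failing the whole batch
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
raw = "id,amount\n1,9.99\n2,not-a-number\n3,5.00\n"
loaded = load_orders(raw, conn)  # 2 valid rows; the malformed one is skipped
```

A production version of this pattern would run under an orchestrator (Informatica, ADF, Airflow) and write to a warehouse such as Synapse or Snowflake rather than SQLite, but the shape is the same.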
Experience:
3+ years
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
Negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience with data modelling techniques and integration patterns.
- Write data transformation jobs in code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge of the data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up to date with the latest developments in data analytics and data science.
- Experience migrating from on-premises data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solutions.
- Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience with data warehouse and ELT solutions, tools, and techniques (e.g., Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python and SQL.
- Experience with data visualization tools such as Tableau, Grafana, or similar.
- Familiarity with cloud computing platforms such as GCP, AWS, or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI alongside data quality and management, with opportunities to work on innovative projects and develop new skills in these fields.
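The "data quality tests" duty in the list above can be made concrete with a minimal check in plain Python. This is a hypothetical sketch; the field names `customer_id` and `amount` are invented for the example:

```python
def run_quality_checks(rows):
    """Flag records that fail two toy data-quality rules:
    completeness (customer_id present) and validity (amount >= 0)."""
    issues = []
    for i, row in enumerate(rows):
        if not row.get("customer_id"):
            issues.append((i, "missing customer_id"))
        if row.get("amount", 0) < 0:
            issues.append((i, "negative amount"))
    return issues

sample = [
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": "", "amount": 35.0},    # fails completeness
    {"customer_id": "C3", "amount": -5.0},  # fails validity
]
issues = run_quality_checks(sample)
```

Tools like dbt or Great Expectations express the same rules declaratively and run them on a schedule, which is closer to what "monitoring, cataloging, and governance" implies at scale.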
- Contact: [email protected] (K.Thipwimon).
Experience:
3+ years
Skills:
Microsoft Azure, SQL, UNIX, Python, Hadoop
Job type:
Full-time
Salary:
Negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI or Tableau, working across workstreams to support data requirements (including reports and dashboards) and collaborating with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns following Data Quality ...
- 3+ years' experience in big data technology, data engineering, or data analytics application development.
- Experience with unstructured data for business intelligence would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop.
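The data-profiling duty mentioned above can be illustrated with a small stdlib-only sketch. Real projects would more likely use Spark or a dedicated profiling library, and the column names here are invented:

```python
def profile(rows):
    """Minimal column profiling: null rate and distinct-value count
    per column, the kind of summary used to understand data patterns."""
    columns = {key for row in rows for key in row}
    n = len(rows)
    report = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        nulls = sum(v is None for v in values)
        report[col] = {
            "null_rate": nulls / n,              # completeness signal
            "distinct": len(set(values) - {None}),  # cardinality signal
        }
    return report

rows = [
    {"city": "Bangkok", "age": 30},
    {"city": "Bangkok", "age": None},
    {"city": "Chiang Mai", "age": 25},
    {"city": None, "age": 30},
]
report = profile(rows)
```

A profile like this is usually the first step before writing data-quality rules: columns with high null rates or unexpected cardinality are where the rules go.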
Experience:
No work experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day to day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- First-tier investigation of any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA or Thai safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English, both written and spoken (minimum TOEIC score of 500).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Database Administration
Job type:
Full-time
Salary:
Negotiable
- Lead the implementation of the data governance framework, policies, and processes defined by the organization.
- Work with data owners and data stewards to define data standards, data management practices, and data usage guidelines.
- Ensure compliance with relevant legal requirements and standards (e.g., GDPR, PDPA).
- Data Quality Management: define and monitor data quality metrics (e.g., accuracy, completeness, timeliness).
- Use data quality tools and processes to correct errors and improve data reliability.
- Coordinate with business units to resolve data quality issues and prevent future ones.
- Metadata and Master Data Management: develop and maintain the organization's metadata repository and data dictionary.
- Oversee master data management so that data stays consistent across systems and processes.
- Stakeholder collaboration: act as the liaison between business units, IT teams, and governance teams to build a culture of data accountability.
- Lead data governance committee meetings and working groups.
- Provide training and support to stakeholders to raise data literacy and adherence to governance guidelines.
- Risk management and compliance: identify data-related risks and propose remediation.
- Ensure that data usage aligns with organizational goals, laws, and ethical standards.
- Lead data governance audits and assessments.
- Experience in data governance and data quality management.
- Bachelor's degree in Business Administration, Computer Science, Business Computing, Information Technology, or a related field.
- Contact.
- Human Resources Office
- Thai Beverage Public Company Limited
- Lao Peng Nguan Tower 1, 333 Vibhavadi Rangsit Road, Chomphon, Chatuchak, Bangkok 10900.
Experience:
5+ years
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
Negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What You'll Need to Succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 5+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert level understanding of JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Skills:
Industry trends, Statistics, Python
Job type:
Full-time
Salary:
Negotiable
- Develop and execute a forward-thinking analytics strategy tailored to the retail industry, focusing on leveraging data platforms to drive revenue growth, operational efficiency, and customer satisfaction.
- Lead, mentor, and inspire a team of data scientists and analysts, fostering a culture of innovation, collaboration, and data-driven decision-making.
- Stay ahead of industry trends, emerging technologies, and best practices in data science and retail analytics to maintain CP Axtra's competitive edge.
- Analytics Execution.
- Oversee the integration of diverse data sources, including POS systems, CRM platforms, online transactions, and third-party providers, into our cloud-based data platform.
- Design and develop advanced machine learning models, algorithms, and statistical analyses to uncover actionable insights related to customer behavior, product performance, and market trends.
- Apply expertise in recommendation and personalization algorithms to enhance customer experiences and engagement.
- Deliver data-driven solutions to optimize pricing strategies, inventory management, and promotional campaigns, leveraging state-of-the-art analytics tools and methodologies.
- Business Partnership.
- Partner closely with retail operations, marketing, and sales teams to understand business challenges and provide tailored analytical support that aligns with strategic objectives.
- Identify opportunities to enhance customer segmentation, personalized marketing efforts, and customer retention strategies through advanced data science techniques.
- Act as a key advisor to senior leadership, translating complex data insights into actionable recommendations and business value.
- Performance Monitoring and Optimization.
- Define and monitor key performance indicators (KPIs) related to retail operations, such as sales conversion rates, customer lifetime value, and basket analysis.
- Leverage analytics to continuously assess and optimize business processes, driving operational efficiency and profitability.
- Communication and Presentation.
- Present complex analytical findings, models, and recommendations to stakeholders in a clear, impactful, and visually compelling manner.
- Collaborate across departments to implement data-driven initiatives that align with CP Axtra's goals and drive tangible outcomes.
- Education and Experience.
- Bachelor's degree in Statistics, Mathematics, Computer Science, Data Science, Economics, or a related field (Master's or PhD strongly preferred).
- Extensive experience in analytics, data science, or business intelligence roles, with significant exposure to the retail industry.
- Technical Skills.
- Advanced proficiency in Python, R, SQL, and machine learning frameworks.
- Expertise in data visualization tools (e.g., Tableau, Power BI) and cloud-based data platforms (e.g., AWS, GCP, Azure).
- In-depth knowledge of big data technologies (e.g., Spark, Hadoop) and modern data engineering practices.
- Strong understanding of recommendation/personalization algorithms and data processing technologies.
- Leadership and Business Acumen.
- Proven ability to lead high-performing teams in a dynamic, fast-paced environment.
- Exceptional strategic thinking and problem-solving skills with a demonstrated focus on delivering business value.
- Deep understanding of retail operations, including inventory management, customer journey mapping, and merchandising strategies.
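The recommendation/personalization expertise this role calls for can be made concrete with the simplest possible approach: item co-occurrence counting over shopping baskets. This is a hypothetical sketch only; production retail recommenders typically use matrix factorization or learned embeddings:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(baskets):
    """Count how often each unordered item pair appears in the same basket."""
    pairs = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            pairs[(a, b)] += 1
    return pairs

def recommend(pairs, item, k=3):
    """Rank other items by how often they co-occur with `item`."""
    scored = Counter()
    for (a, b), count in pairs.items():
        if a == item:
            scored[b] += count
        elif b == item:
            scored[a] += count
    return [other for other, _ in scored.most_common(k)]

baskets = [
    ["milk", "bread"],
    ["milk", "bread", "eggs"],
    ["bread", "eggs"],
]
suggestions = recommend(cooccurrence(baskets), "milk")
```

Even this toy version captures the "customers also bought" idea behind basket analysis; the more advanced methods the role mentions mainly improve how the co-occurrence signal is generalized to unseen item pairs.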
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
3+ years
Skills:
English
Job type:
Full-time
Salary:
Negotiable
- Responsible for planning preventive maintenance schedules for air-conditioning and fire protection systems.
- Responsible for coordinating and managing vendors and suppliers for preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in an annual report.
- Responsible for implementation of mechanical systems such as comfort air and precision air.
- Responsible for implementation of fire suppression systems such as FM200, NOVEC, CO2, fire sprinklers, fire pumps, fire alarms, and VESDA.
- Office hours are 9:00-18:00, with availability to stand by on call or work onsite on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years' experience maintaining air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementation and support of mechanical air-conditioning systems in buildings or data centers.
- At least 1 year's experience designing air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementation and support of mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and to arrive on site within 1 hour of home when the hotline calls.
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Finance, Financial Analysis
Job type:
Full-time
Salary:
Negotiable
- This vacancy is to support new business expansion.
- Active Finance Business Partner (FP&A) in developing property investment strategy and execution: Mixed use project.
- Engage with senior management to understand the wider market trend and external factors which affect the investment.
- Lead and present financial feasibility and valuation of medium to large scale property projects to maximize return on investment.
- Be able to challenge key stakeholders for associated capex and opex investment in details.
- Perform post investment appraisal and provide insights and recommendation for improvement.
- Own the business planning cycle (budget, forecast, long term plan), understand key business drivers, risk and opportunities.
- Lead the continuous improvement of financial process and reporting and be able to leverage relevant technology and tool at work.
- Coach team and drive team effectiveness.
- Bachelor's degree or higher in business administration, finance, engineering, or real estate.
- At least 5 years' financial-evaluation experience in mid- to large-scale property development.
- 7+ years' finance experience in a real estate company / mixed-use project.
- Experience working with senior business stakeholders.
- Feasibility study and financial analysis skills.
- Real Estate Business acumen.
- Stakeholder management and Influencing skills.
- Strategic thinking and financial analysis skills.
- Good communication and presentation skills.
- Effective team management.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
Skills:
Power BI, Excel, Problem Solving
Job type:
Full-time
Salary:
Negotiable
- Following the critical path, ensuring all activities meet the required deadlines.
- Transforming data into business insights.
- Lead analytical tasks utilizing data analytics and Power BI skills.
- Coordinate cross-functional teams (Commercial/Store Operations), persuading with data and reporting.
- Support and conduct meetings with the Commercial senior leadership team to accomplish projects and related tasks.
- Other assignments as it deems appropriate.
- Bachelor's degree or above in IT, IT Engineering, Logistics, Business Data, Marketing, Business Administration, or a related field.
- Experience in retail or supplier supply chain, or distribution operations.
- Background in drawing planograms is a big plus.
- Good Computer skills, especially on MS Excel.
- Product knowledge (preferable).
- Cross-functional agility, and the ability to lead and meet objectives in a fast-paced, rapidly changing environment.
- Strong logical thinking, visual design, and presentation skills with exceptional attention to detail.
- Good analytical & problem solving skills, planning skills, numerical skills.
- Good attitude and self-motivated.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
Experience:
5+ years
Skills:
Industry trends, Data Analysis, Project Management
Job type:
Full-time
Salary:
Negotiable
- Drive and execute initiatives to improve operational efficiency, drive cost savings, and enhance retail productivity through innovation, process improvement, and streamlined operations, supporting the company's and top management's directions.
- Monitor and evaluate store performance, financial data, and in-store execution, providing insights and recommendations to the team for continuous development of store operations and process enhancements.
- Leverage global best practices to implement innovative solutions ...
- Apply business insights, industry trends, and data analysis to guide the team in translating information into actionable initiatives that drive process standardization and improved performance.
- Coordinate and collaborate with cross-functional teams to ensure alignment on project goals and strategies, helping to resolve issues and ensure smooth execution of initiatives.
- Oversee project management efforts, tracking progress and ensuring timely completion of tasks. Provide regular status updates, identify potential risks, and support the team in implementing mitigation plans when necessary.
- Prepare reports and presentations for senior management to communicate project progress, performance metrics, and key developments, offering insights and recommendations for effective decision-making.
- Lead and develop team members, fostering an entrepreneurial mindset and empowering them to identify opportunities for continuous improvement, cost savings, and operational excellence across the organization.
- Degree in Business, Economics, Engineering, or related field.
- 5+ years' working experience in process improvement, project management, quantitative analysis, or cost savings.
- Experience as a consultant for internal/external clients, or experience in the retail sector, is a plus.
- Six Sigma Green Belt certification is a plus.
- Ability to analyze financial, operational, and performance data to generate actionable insights and recommendations.
- Familiarity with data analysis tools (e.g., Power BI, Excel) for analyzing data and generating performance reports.
- Skill in managing and facilitating process changes, ensuring that improvements are implemented effectively and embraced by the team.
- Proficiency in preparing reports and presentations that summarize data, analysis, and project updates for management.
- Strong ability to communicate clearly and persuasively with senior management, team members, and cross-functional partners.
- Ability to mentor and coach team members, developing a strong, capable team with a focus on business optimization.
Experience:
2+ years
Skills:
Risk Management, Microsoft Office, Data Analysis
Job type:
Full-time
Salary:
Negotiable
- Transaction Monitoring: Analyze transactions in real time using fraud detection tools and rules.
- Identify suspicious activity based on pre-defined risk profiles and behavioral patterns.
- Investigate flagged transactions and determine their legitimacy.
- Escalate high-risk cases to the Fraud Management team for further investigation.
- Fraud Investigation: Gather and analyze evidence related to suspected fraudulent activity.
- Conduct research to identify potential fraud schemes and perpetrators.
- Document findings and recommend appropriate actions, such as blocking accounts, recovering funds, or reporting to law enforcement.
- Collaborate with internal teams (customer support, risk management) to resolve cases effectively and efficiently.
- Data Analysis & Reporting: Analyze fraud trends and patterns to identify emerging threats and adjust detection rules accordingly.
- Generate reports on fraud activity, providing insights to the Fraud Management team and senior management.
- Track and measure the effectiveness of fraud prevention and detection measures.
- Stay Informed: Stay up to date on the latest fraud threats, trends, and best practices.
- Participate in ongoing training and development opportunities to enhance your skills and knowledge.
- Minimum of 2-3 years of experience in fraud analysis or a related field.
- Strong analytical and problem-solving skills.
- Excellent attention to detail and ability to identify anomalies in data.
- Proficient in the Microsoft Office Suite, SQL, and data analysis tools.
- Understanding of fraud detection and prevention techniques preferred.
- Effective communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Bachelor's degree in business administration, finance, IT, engineering, or a related field preferred.
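The monitoring duties above boil down to a rule engine that scores each transaction against predefined risk profiles. As a minimal sketch (the field names, thresholds, and rule set here are illustrative assumptions, not any bank's actual rules):

```python
# Minimal rule-based transaction screening sketch.
# Thresholds, field names, and rules are illustrative assumptions.

HIGH_AMOUNT_THB = 500_000   # flag unusually large transfers
MAX_TX_PER_HOUR = 20        # flag bursts of account activity

def screen_transaction(tx, recent_count):
    """Return the list of rule names the transaction triggers."""
    flags = []
    if tx["amount"] > HIGH_AMOUNT_THB:
        flags.append("high_amount")          # amount rule
    if recent_count > MAX_TX_PER_HOUR:
        flags.append("velocity")             # behavioral-pattern rule
    if tx.get("country") not in ("TH", None):
        flags.append("foreign")              # geography rule
    return flags

tx = {"amount": 750_000, "country": "TH"}
print(screen_transaction(tx, recent_count=3))  # ['high_amount']
```

Flagged transactions would then be routed to an analyst queue; production systems layer statistical and ML scoring on top of rules like these.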
Experience:
5+ years
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
Negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
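The posting's core duty is an ingest → transform → load pipeline. As a pure-Python stand-in for the shape of that logic (in practice this would run on PySpark against cloud storage; the record fields and validation rules below are illustrative assumptions):

```python
# Pure-Python sketch of an ingest -> transform -> load pipeline.
# Field names and validation rules are illustrative assumptions;
# production code would use PySpark DataFrames and cloud storage.

def ingest(rows):
    """Simulate streaming raw records out of a source system."""
    yield from rows

def transform(records):
    """Normalize amounts and drop incomplete rows."""
    for r in records:
        if r.get("user_id") and r.get("amount") is not None:
            yield {"user_id": r["user_id"], "amount": round(float(r["amount"]), 2)}

def load(records, sink):
    """Append cleaned records to the target store (a list stands in here)."""
    for r in records:
        sink.append(r)
    return sink

raw = [{"user_id": "u1", "amount": "10.456"}, {"user_id": None, "amount": "3"}]
warehouse = load(transform(ingest(raw)), [])
print(warehouse)  # [{'user_id': 'u1', 'amount': 10.46}]
```

Because each stage is a generator, records flow through without materializing intermediate copies, which mirrors how distributed engines pipeline narrow transformations.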
Experience:
8+ years
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
Negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What You'll need to Succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 8+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert-level understanding of the JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
Negotiable
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium-to-complex data sets that meet functional and non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In-depth knowledge of the data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python, and SQL.
- Clear hands-on experience with database systems - the Hadoop ecosystem, cloud technologies (e.g. AWS, Azure, Google Cloud), in-memory database systems (e.g. SAP HANA, Hazelcast), traditional RDBMSs (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge of data extraction and transformation tools - traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
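The duty above on encryption, obfuscation, and role-based access typically means pseudonymizing sensitive fields before data lands in the lake. A minimal sketch, assuming a keyed hash is acceptable (the key source and field names here are illustrative, not a specific platform's API):

```python
import hashlib
import hmac

# Sketch of field-level obfuscation (pseudonymization) applied before
# loading. The key and field names are illustrative assumptions; in
# practice the key would come from a secrets manager, never source code.

SECRET_KEY = b"rotate-me-in-a-vault"

def pseudonymize(value: str) -> str:
    """Keyed hash: the same input always maps to the same opaque token,
    so joins across tables still work, but the raw ID is not recoverable
    without the key."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "C-1001", "spend": 42.5}
safe = {**record, "customer_id": pseudonymize(record["customer_id"])}
print(safe["customer_id"] != "C-1001")  # True
```

Using HMAC rather than a bare hash prevents dictionary attacks on low-entropy identifiers; truncating to 16 hex characters trades collision resistance for storage, which is a judgment call per dataset.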
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
Negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouses.
- Analyze and translate functional specifications and change requests into technical specifications.
- Experience in SQL programming in an RDBMS such as Oracle.
- Develops ETL technical specifications; designs, develops, tests, implements, and supports optimal data solutions.
- Develops and documents ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and create code for all related data extraction, transformation, and loading (ETL) into the databases under your responsibility.
- Creates, executes, and documents unit test plans for ETL and data integration processes and programs.
- Perform problem assessment, resolution, and documentation for existing ETL packages, mappings, and workflows in production.
- Performance-tune the ETL process and SQL queries; recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IaC tools like Terraform or Azure ARM.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
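The extract-transform-load cycle described above can be sketched end to end; SQLite stands in for Oracle here, and the staging table, source rows, and validation rule are illustrative assumptions:

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform (cast and validate),
# load into a staging table, then verify with SQL. SQLite stands in for
# Oracle; table and column names are illustrative assumptions.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (region TEXT, amount REAL)")

# Extract: raw rows as they might arrive from a source interface file.
source_rows = [("north", "100.0"), ("south", "250.5"), ("north", None)]

# Transform: cast amounts to float, reject rows failing validation.
clean = [(region, float(amt)) for region, amt in source_rows if amt is not None]

# Load, then reconcile with a SQL check.
conn.executemany("INSERT INTO stg_sales VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM stg_sales").fetchone()[0]
print(total)  # 350.5
```

The final aggregate doubles as a unit-test assertion: comparing a source-side control total against the loaded total is a common way to document and verify an ETL mapping.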
Experience:
2+ years
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
Negotiable
- Develop machine learning models such as credit model, income estimation model and fraud model.
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support the collection strategy.
- Bachelor's degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience with programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience building and implementing AI/ML solutions for a senior role.
- Experience with Python libraries - NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
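A credit model of the kind listed above is, at its simplest, a binary classifier. A toy logistic-regression sketch trained with plain gradient descent follows; the single feature, labels, and hyperparameters are synthetic stand-ins (real work would use scikit-learn or TensorFlow on the telco/retail/loan data mentioned):

```python
import math

# Toy logistic-regression credit scorer, trained by stochastic gradient
# descent. The feature (normalized debt-to-income), labels, and learning
# rate are synthetic illustrations, not a real credit model.

X = [(0.1,), (0.2,), (0.8,), (0.9,)]   # one feature per applicant
y = [0, 0, 1, 1]                        # 1 = likely default

w, b, lr = 0.0, 0.0, 1.0
for _ in range(2000):
    for (x,), target in zip(X, y):
        p = 1 / (1 + math.exp(-(w * x + b)))   # predicted probability
        w -= lr * (p - target) * x             # gradient step on weight
        b -= lr * (p - target)                 # gradient step on bias

def predict(x):
    """Classify an applicant: True means predicted default."""
    return 1 / (1 + math.exp(-(w * x + b))) > 0.5

print(predict(0.15), predict(0.85))  # False True
```

The same loop generalizes to many features by making `w` a vector; feature engineering (the duty listed above) is then the work of constructing those inputs from raw telco and retail data.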
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
Negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and ensure data comprehensiveness for data consumers.
- Working closely with engineering to ensure data service solutions are delivered in a timely and cost-effective manner.
- Retrieve and prepare data (automated where possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch / real-time / near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience in web or software development with Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.
Experience:
5+ years
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
Negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers for preventive maintenance and payment plans.
- 2nd-level support to Data Center Operations (FOC), on site to resolve incident and problem management.
- 2nd-level support to engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- Create and update reports and documents to comply with ISO 20k, 22k, 27k, 50k, and TCOS standards.
- Review PUE and energy cost savings, and report.
- Measure air-system efficiency and record it in an annual report.
- Responsible for implementing electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS.
- Bachelor's degree in Engineering, Electrical Engineering, or a related field.
- More than 5 years of experience in the maintenance of electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS; implementing and supporting electrical systems in buildings or data centers.
- At least 1 year of experience designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS); implementing and supporting electrical systems in buildings.
- Able to use AutoCAD and Visio.
- Able to work as part of a team and be on standby on-call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to site from your home).
- Proficiency in English communication, both reading and writing, is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
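The PUE review mentioned above uses the standard ratio of total facility energy to IT equipment energy: a PUE of 1.0 would mean every watt goes to IT load, and anything above that is cooling and power-delivery overhead. A minimal calculation sketch (the kWh figures are made-up examples):

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment
# energy, measured over the same period. The kWh values below are
# made-up examples, not figures from any actual site.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1.5 GWh while its IT load consumes 1.0 GWh:
print(round(pue(1_500_000, 1_000_000), 2))  # 1.5
```

Tracking this ratio over time (ideally per season, since cooling load varies) is what turns the "review PUE and energy cost savings" duty into concrete efficiency targets.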