Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
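The role above centers on ETL pipelines feeding a cloud warehouse. As a minimal, self-contained sketch of that extract-transform-load flow (using SQLite in place of a cloud warehouse, with invented table and column names; a real pipeline would read from cloud storage or an API and use PySpark at scale):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would arrive from object storage
# or an upstream API rather than an inline string.
RAW_CSV = """event_id,user_id,amount
1,alice,10.50
2,bob,3.25
3,alice,7.00
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV feed into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cast types and drop malformed rows."""
    out = []
    for r in rows:
        try:
            out.append((int(r["event_id"]), r["user_id"], float(r["amount"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine and log these rows
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Write cleaned rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(event_id INTEGER PRIMARY KEY, user_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO events VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
```

The `INSERT OR REPLACE` keyed on `event_id` keeps the load idempotent, so re-running the job on the same extract does not duplicate data.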
Skills:
ETL, Python, Java
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
- Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
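The "ensure data quality, consistency, and reliability" responsibility above usually means computing batch-level metrics and gating loads on them. A small sketch, with invented column names and thresholds:

```python
# Hypothetical batch of warehouse rows; field names and rules are invented
# for illustration, not taken from any specific schema.
rows = [
    {"order_id": 1, "country": "TH", "total": 120.0},
    {"order_id": 2, "country": None, "total": 75.5},
    {"order_id": 3, "country": "SG", "total": -4.0},
]

def quality_report(batch: list[dict]) -> dict:
    """Compute simple completeness and validity metrics for a batch."""
    n = len(batch)
    complete = sum(1 for r in batch if all(v is not None for v in r.values()))
    nonneg = sum(1 for r in batch if (r["total"] or 0) >= 0)
    return {
        "row_count": n,
        "completeness": complete / n,  # share of rows with no null fields
        "validity": nonneg / n,        # share of rows with non-negative totals
    }

report = quality_report(rows)
```

A pipeline might fail the load, or route offending rows to a quarantine table, whenever `completeness` or `validity` falls below an agreed threshold.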
Experience:
1 year required
Skills:
ETL, Data Analysis, Industry trends
Job type:
Full-time
Salary:
฿70,000 - ฿90,000, negotiable
- Analyze and interpret complex data sets to uncover insights and trends that drive business strategy and decision-making.
- Collaborate with cross-functional teams to understand their data needs and provide actionable recommendations.
- Design and maintain dashboards, reports, and visualizations using tools to communicate insights effectively.
- Extract data from various sources, including databases, APIs, and third-party services, ensuring data quality and accuracy.
- Develop and implement data models, ETL processes, and automated reporting solutions to streamline data analysis.
- Stay updated with industry trends and new technologies to enhance the company's data analytics capabilities.
- Participate in data governance initiatives, ensuring compliance with data privacy and security regulations.
- Bachelor's degree in Statistics, Data Science, or a related field; an MBA or advanced degree is a plus.
- Minimum of 5 years of experience in business intelligence or data analysis, preferably in a fast-paced e-commerce environment.
- Proficient in SQL and at least one data visualization tool (e.g., Tableau, Power BI), with a solid understanding of data warehousing concepts.
- Strong analytical skills, with the ability to manipulate, clean, and derive insights from large datasets.
- Effective communicator with excellent presentation skills, capable of translating complex data into simple, actionable insights for non-technical stakeholders.
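The SQL-plus-dashboard work described above typically starts with aggregates like the one below: channel share of revenue, the kind of query that feeds a Tableau or Power BI visual. The schema and figures are invented; SQLite stands in for the warehouse:

```python
import sqlite3

# Toy dataset standing in for an e-commerce warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, channel TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("2024-01-01", "web", 100.0),
        ("2024-01-01", "app", 60.0),
        ("2024-01-02", "web", 80.0),
        ("2024-01-02", "app", 90.0),
    ],
)

# Revenue by channel, with each channel's share of the grand total.
rows = conn.execute(
    """
    SELECT channel,
           SUM(revenue) AS total,
           ROUND(100.0 * SUM(revenue) / (SELECT SUM(revenue) FROM sales), 1)
               AS pct_of_total
    FROM sales
    GROUP BY channel
    ORDER BY total DESC
    """
).fetchall()
```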
Skills:
ETL, Java, Python
Job type:
Full-time
Salary:
negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve the team's productivity.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Requirements: Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Strong numerical and analytical skills.
- Worked with BI tools such as Tableau, Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for the technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Hands-on development skills in Informatica IICS, Databricks/Spark, and Python are a must.
- Clear hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, Cosmos), cloud technologies (e.g., AWS, Azure, Google Cloud), and NoSQL databases (e.g., Cosmos DB, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge & working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
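Among the ETL/ELT best practices the listing above asks for, idempotent merge (upsert) loads are central: re-running a load must update changed records rather than duplicate them. A minimal sketch using SQLite's `ON CONFLICT ... DO UPDATE`; the `policy` table and its columns are invented for illustration, not any product's actual schema:

```python
import sqlite3

# Target table keyed on a natural/business key so reloads are idempotent.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE policy (policy_id TEXT PRIMARY KEY, status TEXT, updated TEXT)"
)

def merge(rows: list[tuple]) -> None:
    """Insert new rows, update existing ones in place (an upsert/merge load)."""
    conn.executemany(
        """
        INSERT INTO policy (policy_id, status, updated) VALUES (?, ?, ?)
        ON CONFLICT(policy_id) DO UPDATE SET
            status = excluded.status,
            updated = excluded.updated
        """,
        rows,
    )

# First load.
merge([("P-1", "active", "2024-01-01"), ("P-2", "lapsed", "2024-01-01")])
# Second load: P-1 changed, P-3 is new; no duplicates are created.
merge([("P-1", "lapsed", "2024-02-01"), ("P-3", "active", "2024-02-01")])

count = conn.execute("SELECT COUNT(*) FROM policy").fetchone()[0]
p1_status = conn.execute(
    "SELECT status FROM policy WHERE policy_id = 'P-1'"
).fetchone()[0]
```

Warehouse engines such as Snowflake and Databricks express the same pattern with a `MERGE INTO` statement.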
Experience:
3 years required
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience of data modelling techniques and integration patterns.
- Write data transformation jobs through code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge in data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solution.
- Practical experience with modern and traditional Big Data stacks (e.g. BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience with data warehouse and ELT solutions, tools, and techniques (e.g. Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning, AI, and data quality management, with opportunities to work on innovative projects and develop new skills in these fields.
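The orchestration tools named above (e.g. Airflow) boil down to running tasks in dependency order. A toy stand-in using the standard library's `graphlib`; the task names are invented and each task body is a placeholder where a real transform or I/O step would go:

```python
from graphlib import TopologicalSorter

results: list[str] = []

# Each task maps to the list of tasks it depends on (its upstreams).
tasks = {
    "extract": [],
    "clean": ["extract"],
    "aggregate": ["clean"],
    "publish": ["aggregate", "clean"],
}

def run(name: str) -> None:
    """Placeholder task body; a real DAG task would transform or move data."""
    results.append(name)

# static_order() yields tasks so every dependency runs before its dependents.
for task in TopologicalSorter(tasks).static_order():
    run(task)
```

Real orchestrators add scheduling, retries, and backfills on top of exactly this dependency-ordered execution.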
- Contact: [email protected] (K.Thipwimon).
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Therefore, please do not upload any documents, including copies of your national ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the job application on the website. In addition, please make sure that any sensitive personal data (if any) has been removed from your resume and other documents before uploading them. The Bank does need to collect personal data about your criminal record in order to assess candidates for employment, or to verify qualifications, disqualifications, or suitability for a position. Consent to the collection, use, or disclosure of your criminal-record data is therefore necessary for entering into a contract and for consideration for the purposes above. If you do not give such consent, or later withdraw it, the Bank may be unable to proceed for those purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
ETL, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Conduct meeting with users to understand the data requirements and perform database design based on data understanding and requirements with consideration for performance.
- Maintain data dictionary, relationship and its interpretation.
- Analyze problem and find resolution, as well as work closely with administrators to monitor performance and advise any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (Retail or E-Commerce business is preferable).
- Expert experience in query languages: SQL, Databricks SQL, PostgreSQL.
- Experience in Big Data technologies like Hadoop, Apache Spark, Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
3 years required
Skills:
Big Data, ETL, SQL, Python
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meeting with users to understand the data requirements and perform database design based on data understanding and requirements with consideration for performance.
- Maintain data dictionary, relationship and its interpretation.
- Analyze problem and find resolution, as well as work closely with administrators to monitor performance and advise any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (Retail or E-Commerce business is preferable).
- Expert experience in query languages: SQL, Databricks SQL, PostgreSQL.
- Experience in Big Data technologies like Hadoop, Apache Spark, Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
No experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day to day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- First tier investigate any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Experience:
3 years required
Skills:
English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for air condition & Fire protection systems.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE, cost saving energy and report.
- Measure air-system efficiency and record it in an annual report.
- Responsible for implementation of Mechanical such as comfort air, precision air.
- Responsible for implementation of Fire suppression such as FM200, NOVEC, CO2, Fire Sprinkler, Fire Pump, Fire alarm, and Vesda.
- Working hours: 9:00 - 18:00 office time, with availability for on-call standby or on-site work on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years' experience in maintaining air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors), including implementation and support of mechanical air-conditioning systems in buildings or data centers.
- At least 1 year's experience in designing air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors), including implementation and support of mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as part of a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (on site within 1 hour from home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Software Development, PHP, Golang, English
Job type:
Full-time
Salary:
negotiable
- Lead the design and implementation of high-quality software applications, ensuring best practices are followed.
- Collaborate with cross-functional teams to define, design, and deliver new features and enhancements.
- Mentor and guide junior engineers, fostering their technical development and growth.
- Conduct thorough code reviews to maintain high coding standards and ensure overall code quality.
- Optimize application performance and scalability, identifying opportunities for improvement.
- Design system architecture with a focus on security and adherence to programming standards.
- Solve complex technical challenges and provide strategic, scalable solutions.
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 3+ years of experience in software development.
- A Master's degree or additional certifications in relevant areas is a plus.
- Programming Language Proficiency: Strong expertise in PHP, Golang, NodeJS, and TypeScript.
- Experience with Programming Frameworks: Proficient in Go-Fiber, Go-Gin, ReactJS, NextJS, AngularJS, Laravel, and CodeIgniter.
- Database Experience: Hands-on experience with databases such as MongoDB, MariaDB, MySQL, and PostgreSQL.
- Strong understanding of data structures and algorithms.
- Expertise in system architecture design and development.
- In-depth knowledge of security programming standards and best practices.
- Advanced technical problem-solving abilities, with a proven ability to address complex issues.
- Possesses a positive attitude and participates in team-building and events.
- Comfortable presenting technical information and project updates to both technical and non-technical stakeholders.
- Skilled in using AI to solve complex problems, leading to improved outcomes.
- Be able to communicate in both Thai and English.
- Experience with reactive programming techniques and frameworks.
- Knowledge of cloud computing environments and microservices architecture design and implementation.
- Familiarity with DevOps practices and tools, including continuous integration and deployment processes.
- Remark: Given the nature of this position, which involves customer data and asset values, and in order to comply with legal and regulatory standards established by the Securities and Exchange Commission and other overseeing agencies, the company requires a criminal background check as part of the post-interview process before joining the company. Your criminal history information will be retained for a period of 6 months from the start date.
- Important: https://careers.bitkub.com/privacy.
Skills:
Cloud Computing, RESTful, JSON
Job type:
Full-time
Salary:
negotiable
- Bachelor's or Master's degree in Computer and Telecommunication Engineering, Computer Science, IT, or a related field.
- 8 - 13 years of experience in the Computer or Telecommunication field.
- Good Knowledge on cloud computing & edge computing technology.
- Good understanding on infrastructure technic that related of TCP/IP, Switch, Router, Firewall, LBS, and DNS.
- Good understanding technic that related of IoT/M2M/MEC Network Protocols - HTTP, HTTPS, Restful, MQTT, COAP, JSON objects, API, SNMP.
- Operating system knowledge: Linux (Red Hat, CentOS), Windows Server.
- Database knowledge: MongoDB, NoSQL databases, SQL, PostgreSQL.
- Good understanding of Docker and Kubernetes operations.
Skills:
Procurement
Job type:
Full-time
Salary:
negotiable
- Design & Develop solution to cover all required FBB & WIFI core network area and fulfill business and service requirement for Consumer, SME, Enterprise & FMC.
- Determine cost structure and propose best practice investment efficiency and control investment within assigned annual budget.
- Explore new FBB & WiFi core network, data center, and IT-related technologies that can fulfill business requirements; evaluate and shortlist them for future procurement.
- Design and Develop network planning and operation tool to digitalize planning & operation process.
- Bachelor or higher degree in Computer, IT, or Telecom Engineering.
- At least 3-10 years of experience with a mobile operator or broadband network company, especially in the core network domain.
- Strong knowledge in Core Network (MPLS, BNG, DPI,CGN, DHCP, AAA) & IT System Infrastructure (Switch, Load Balance, Firewall/WAF, Server, Storage) Design/Planning.
Job type:
Full-time
Salary:
negotiable
- Data Governance
- Lead the implementation of the organization's data governance framework, policies, and processes.
- Work with data owners and data stewards to define data standards, data management practices, and data usage guidelines.
- Ensure compliance with relevant legal requirements and standards (e.g., GDPR, PDPA).
- Data Quality Management
- Define and monitor data quality metrics (e.g., accuracy, completeness, timeliness).
- Apply data quality management tools and processes to correct errors and improve data reliability.
- Coordinate with business units to resolve data quality issues and prevent them from recurring.
- Metadata and Master Data Management
- Develop and maintain the organization's metadata repository and data dictionary.
- Oversee master data management so that data remains consistent across systems and processes.
- Stakeholder Collaboration
- Act as a liaison between business units, IT teams, and governance teams to build a culture of data accountability.
- Lead data governance committee meetings and working groups.
- Provide training and support to stakeholders to raise data literacy and adherence to governance guidelines.
- Risk Management and Regulatory Compliance
- Identify data-related risks and propose remediation measures.
- Ensure that data usage aligns with organizational goals, laws, and ethical standards.
- Lead data governance audits and assessments.
- Experience in data governance and data quality management.
- Bachelor's degree in Business Administration, Computer Science, Business Computing, Information Technology, or a related field.
- Contact.
- Human Resources Office.
- Thai Beverage Public Company Limited.
- Lao Peng Nguan Building 1, 333 Vibhavadi Rangsit Road, Chomphon, Chatuchak, Bangkok 10900.
Skills:
Power BI, Excel, Problem Solving
Job type:
Full-time
Salary:
negotiable
- Following the critical path, ensuring all activities meet required deadlines.
- Transforming data into business insights.
- Lead analytical tasks using data analytics and Power BI skills.
- Coordinate cross-functional teams (Commercial/Store Operations), persuading with data and reporting.
- Support and conduct meetings with the Commercial senior leadership team to accomplish projects and related tasks.
- Other assignments as it deems appropriate.
- Bachelor's degree or above in IT, IT Engineering, Logistics, Business Data, Marketing, Business Administration, or a related field.
- Experience in retail, supplier supply chain, or distribution operations.
- Background in drawing planograms is a big plus.
- Good computer skills, especially MS Excel.
- Product knowledge (preferable).
- Cross-functional agility, and the ability to lead and meet objectives in a fast-paced, rapidly changing environment.
- Strong logical thinking, visual design, and presentation skills with exceptional attention to detail.
- Good analytical and problem-solving skills, planning skills, and numerical skills.
- Good attitude and self-motivation.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Data Analysis, SQL, Excel, English
Job type:
Full-time
Salary:
negotiable
- Data Analysis: Conduct in-depth analysis of retail and wholesale business data to address specific business questions and challenges.
- Insight Generation: Interpret results from dashboards and data analyses to develop actionable insights and strategic recommendations.
- Requirement Gathering: Identify business problems, gather requirements, and propose potential solutions, including leveraging AI to enhance business operations.
- ML Model Creation: Build data analytics models, both deterministic and machine learning.
- AI Vendor Coordination: Collaborate with external AI suppliers to align project objectives with technological capabilities.
- Cross-Departmental Collaboration: Work with various departments to develop and implement data-driven strategies that optimize business processes and decision-making.
- Communication: Act as a liaison between stakeholders and AI vendors, ensuring clear communication and understanding of project requirements.
- Data Analytics and AI Strategy Design: Design and recommend how Business Intelligence (BI) and AI technologies can address business problems and provide further insights.
- Decision-Making Support: Present key findings from your own analysis and strategic recommendations to business counterparts and senior management, focusing on project approaches and strategic planning.
- Bachelor's degree or higher in Computer Science, Engineering, Information Technology, or Management Information Systems.
- Strong business acumen, with a deep understanding of retail and wholesale business.
- At least 5 years of proven experience in a data analytics role (retail or e-commerce business preferred).
- Hands-on experience with SQL and cloud data platforms (e.g., Databricks, Snowflake, GCP, or AWS), and high proficiency in Excel.
- Good knowledge of statistics.
- Experience in Python (Pandas, NumPy, SparkSQL) and data visualization (Tableau, Power BI) is a plus.
- Excellent communication skills with the ability to convey complex findings to non-technical stakeholders.
- Fluent in Thai and English.
- Good attitude toward teamwork and willingness to work hard.
Skills:
Project Management, SQL, Python, English
Job type:
Full-time
Salary:
negotiable
- Fine-tune pre-trained language models for generative AI applications; ensure LLMs and LLM-based pipelines are tuned and released.
- Create and implement LLMs for various content-creation tasks; develop and communicate roadmaps for data science projects.
- Design effective agile workflows and manage a cycle of deliverables that meet timeline and resource constraints.
- Serve as a bridge between stakeholders and AI suppliers to facilitate seamless communication and understanding of project requirements.
- Work closely with external AI suppliers to ensure alignment between project goals and technological capabilities.
- Identify and gather data sets necessary for AI projects.
- Prior experience in Machine Learning, Deep Learning, and AI algorithm to solve respective business cases and pain points.
- Prior hands-on experience in data-mining techniques to better understand each pain point and provide insights.
- Able to design and conduct analysis to support product & channel improvement and development.
- Present key findings and recommendations to business counterparts and senior management on project approach and strategic planning.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- Native Thai speaker & fluent in English.
- 3+ years of proven experience as a Data Scientist with a focus on project management (retail or e-commerce business preferred).
- At least 2+ years of relevant experience as an LLM Data Scientist; experience in SQL and Python (Pandas, NumPy, SparkSQL).
- Ability to manipulate and analyze complex, high-volume, high-dimensionality data from varying sources.
- Experience with Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in machine learning and deep learning (TensorFlow, Keras, scikit-learn).
- Good Knowledge of Statistics.
- Experience in Data Visualization (Tableau, PowerBI) is a plus.
- Excellent communication skills with the ability to convey complex findings to non-technical stakeholders.
- Good attitude toward teamwork and willingness to work hard.
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit models, income estimation models, and fraud models.
- Research cutting-edge techniques to enhance existing model performance.
- Explore and conduct feature engineering on existing data sets (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support collection strategy.
- Bachelor's degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience in programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience building and implementing AI/ML solutions for the senior role.
- Experience with Python libraries: NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
Skills:
Research, Risk Management, Microsoft Office
Job type:
Full-time
Salary:
negotiable
- Transaction Monitoring: Analyze transactions in real time using fraud detection tools and rules.
- Identify suspicious activity based on pre-defined risk profiles and behavioral patterns.
- Investigate flagged transactions and determine their legitimacy.
- Escalate high-risk cases to the Fraud Management team for further investigation.
- Fraud Investigation: Gather and analyze evidence related to suspected fraudulent activity.
- Conduct research to identify potential fraud schemes and perpetrators.
- Document findings and recommend appropriate actions, such as blocking accounts, recovering funds, or reporting to law enforcement.
- Collaborate with internal teams (customer support, risk management) to resolve cases effectively and efficiently.
- Data Analysis & Reporting: Analyze fraud trends and patterns to identify emerging threats and adjust detection rules accordingly.
- Generate reports on fraud activity, providing insights to the Fraud Management team and senior management.
- Track and measure the effectiveness of fraud prevention and detection measures.
- Stay Informed: Stay up to date on the latest fraud threats, trends, and best practices.
- Participate in ongoing training and development opportunities to enhance your skills and knowledge.
- Minimum of 2-3 years of experience in fraud analysis or a related field.
- Strong analytical and problem-solving skills.
- Excellent attention to detail and ability to identify anomalies in data.
- Proficient in Microsoft Office Suite, SQL, and data analysis tools.
- Understanding of fraud detection and prevention techniques preferred.
- Effective communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Bachelor's degree in business administration, finance, IT, engineering, or a related field preferred.
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and ensure data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, with strong problem-solving and organizational skills, high initiative, and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, and near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.