Job type:
Full-time
Salary:
Negotiable
- Design and implement methods for storing, retrieving, and monitoring data pipelines: from ingestion of raw data sources through transformation, cleaning, storage, and enrichment so the data is promptly usable, for both structured and unstructured data, working with data lakes and cloud-based big data technology.
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI or Tableau, working across workstreams to support data needs, and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns following Data Quality and Data Management processes
- 3+ years of experience in big data technology, data engineering, or data analytics application development.
- Experience with unstructured data for business intelligence or computer science would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop programming.
Skills:
ETL, Java, Python
Job type:
Full-time
Salary:
Negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
Requirements:
- Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Great numerical and analytical skills.
- Experience with BI tools such as Tableau and Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Experience:
3+ years
Skills:
English
Job type:
Full-time
Salary:
Negotiable
- Responsible for planning preventive maintenance schedules for air-conditioning and fire protection systems.
- Responsible for coordinating and managing vendors and suppliers for preventive maintenance and payment plans.
- Provide 2nd-level support to Data Center Operations (FOC), on site for incident and problem management.
- Provide 2nd-level support to engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- Create and update reports and documents to comply with ISO 20k, 22k, 27k, 50k, and TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in an annual report.
- Responsible for implementing mechanical systems such as comfort air and precision air conditioning.
- Responsible for implementation of Fire suppression such as FM200, NOVEC, CO2, Fire Sprinkler, Fire Pump, Fire alarm, and Vesda.
- Working hours are 9:00-18:00 office time; able to stand by on call or work on site on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years' experience in air-conditioning maintenance (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementing and supporting mechanical air-conditioning systems in buildings or Data Centers.
- At least 1 year's experience in designing air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementing and supporting mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work in a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (on site within 1 hour from home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
Negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Informatica IICS, Databricks/Spark, and Python hands-on development skills are a must.
- Clear hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, and Cosmos), cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge of and working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
Skills:
ISO 27001, English
Job type:
Full-time
Salary:
Negotiable
- Responsible for monitoring, controlling, and managing facility systems, including electrical, air-conditioning, and network systems, to support operations.
- Respond to customer needs and coordinate vendors' installations and troubleshooting to ensure the work is correct and complete according to standard practice.
- Control and coordinate preventive maintenance and repairs of facility systems, such as generators, UPS units, electrical distribution panels, air-conditioning systems, and network equipment installations.
- Act as 1st-level support & troubleshooting for Data Center facility systems, e.g. network, electrical, and air-conditioning systems.
- Prepare operating procedures and work manuals for facility systems, based on ISO or other standards relevant to operations (e.g. ISO 20000 for service, ISO 27001 for security, ISO 50001 for energy management, and others such as ISO 22301, PCI DSS, TCOS), including record forms and reports.
- Summarize and report any critical problems to the team lead, and prepare statistical and analytical reports on a daily, monthly, and quarterly basis.
- Bachelor's degree in Electrical Power, Mechanical Engineering, or related fields.
- Thai nationality, Male, Age 20 - 25 years old.
- Have basic technical knowledge in Data Center facilities (Electrical/Mechanical).
- Able to work under pressure.
- Able to work with a team.
- Fair communication in English.
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
Negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query language (SQL), Databricks SQL, PostgreSQL.
- Experience in big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Having good attitude toward team working and willing to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
3+ years
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
Negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience of data modelling techniques and integration patterns.
- Write data transformation jobs through code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge in data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solution.
- Practical experience with modern and traditional big data stacks (e.g. BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience working with data warehouse and ELT solutions, tools, and techniques (e.g. Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI alongside data quality and management, providing opportunities to work on innovative projects and develop new skills in these exciting fields.
- Contact: [email protected] (K.Thipwimon).
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Therefore, please do not upload any documents, including a copy of your national ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the purpose of applying for this job on the website. In addition, please make sure that you have removed any sensitive personal data (if any) from your resume and other documents before uploading them to the website. The Bank needs to collect personal data about your criminal record in order to consider you for employment, or to verify your qualifications, prohibited characteristics, or suitability for the position. Your consent to the collection, use, or disclosure of your criminal-record data is necessary for entering into a contract and for being considered for the purposes above. If you do not give such consent, or later withdraw it, the Bank may be unable to fulfill those purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
ETL, SQL, Hadoop
Job type:
Full-time
Salary:
Negotiable
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query language (SQL), Databricks SQL, PostgreSQL.
- Experience in big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Having good attitude toward team working and willing to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
Experience:
No prior work experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day to day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- Provide first-tier investigation of any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Familiar with interpreting wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
ETL, Compliance, SQL
Job type:
Full-time
Salary:
Negotiable
- Design, implement, and manage end-to-end data pipelines architectures.
- Configure and maintain data ingest workflows (ETL) across several production systems.
- Transform data into Data Mart, Data Model that can be easily analyzed.
- Ensure the data is correct and in a highly usable state, delivered on time and with good performance.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Ensure compliance with data governance and security policies.
- Bachelor's Degree or higher in Computer Science, Information Technology, Computer Engineering, or a related field.
- Minimum 3 years of work experience as a Data Engineer.
- Strong SQL command and knowledge of NoSQL tools and languages.
- Experience with ETL tools and data models, such as SSIS and SSAS.
- Experience working on big data platforms is an advantage.
- Experience with cloud platforms such as AWS and GCP.
- Ability to develop in Python or R is an advantage.
- Good business understanding, able to identify business problems, formulate business goals, and find relevant data.
- Required experience.
- 3 years.
- Job level.
- Supervisor.
- Salary.
- Negotiable.
- Job function.
- IT / Programming.
- Engineering.
- Job type.
- Full-time.
- About the company: 500-1000 employees.
- Company type: consumer goods industry.
- Company location: Bangkok.
- Website: www.osotspa.com.
- Founded: 1891.
- Rating: 4.5/5.
- We aim to be Thailand's leading consumer-goods company, in tune with consumer lifestyles and widely recognized across the ASEAN region. Through more than a century of proud success, Osotspa remains committed to researching and developing products that meet the needs of Thai people and improve their quality of life, continuously building our capabilities to earn the confidence of Thai consumers. Today, Osotspa can proudly say that we operate a fully integrated business: from researching consumer needs, to inventing and developing innovations, to manufacturing, marketing, distribution, and continuous marketing and sales promotion activities, ensuring that Osotspa products are truly part of consumers' ways of life and lifestyles.
- Join us: Today, at more than 123 years old, Osotspa continues its unwavering commitment to developing quality products under the philosophy it upholds: "Put the benefit of others before your own, value people's goodwill over money, be honest in your profession, and uphold business ethics."
- Office district: Bang Kapi.
- Head office: 348, Ramkhamhaeng Road.
Skills:
ETL, Python, Java
Job type:
Full-time
Salary:
Negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Job type:
Full-time
Salary:
Negotiable
- Analyze data and provide business insight.
- Assess risk and design mitigation actions based on data insight.
- Design dashboards / predictive models for decision-making management.
- Lead and Supervise team by projects.
Skills:
Data Analysis, English
Job type:
Full-time
Salary:
Negotiable
- Data Innovation Team.
- Based at Infinitas by Krungthai.
- As a data analyst at Infinitas, you can expect a new level of analytical experience. As part of the Data Innovation team, your goal is to make data useful for all stakeholders. Under the big umbrella of the leading bank in Thailand, Infinitas leverages the largest financial data sets from both traditional banking and mobile banking services, from customer digital footprints to branches, ATMs, and call centers. We mea ...
- Job Responsibilities
- Conduct data inventory research with product owners, business owners, and IT BAs to gain a full understanding of data availability.
- Communicate with business owners to translate business problem/challenge into actionable analytical solution
- Initiate EDA ideas to tag hidden opportunities for customer, product, channel and other various areas.
- Analyze digital and traditional user journey funnel and customer persona
- Visualize data for fast decision making and insight interpretation
- Define customer segmentations for strategy planning and marketing targeting
- Plan holistic A/B testing campaigns to evaluate data values on business impact
- Design and fulfill monitoring dashboards and automated reports
- English as working language
- Minimum of 3 years data analytics related working experiences
- At least 1 year of working experience directly communicate to business team
- Proficient in Python or SQL
- Advanced hands-on experience with visualization tools
- Strong communication and analytical thinking skills
- Good balance of data and business knowledge
- Fintech or banking industry
- Internet companies with mobile applications.
Skills:
Excel, Python, Power BI
Job type:
Full-time
Salary:
Negotiable
- Create, develop, and monitor auto-replenishment and its parameters.
- Maintain and adjust parameters to optimize stock availability/stock levels during normal and promotion periods.
- Investigate and identify root causes of overstocking and out-of-stock (OOS) at stores/DCs.
- Monitor target stock in normal/seasonal periods to suit business sales targets.
- Adjust daily sales in the system to correct average daily sales after promotion periods.
- Forecast demand for each promotion campaign to manage parameter settings.
- Develop a daily KPI dashboard to monitor sales performance versus the system's suggested numbers.
- Bachelor's degree in Supply Chain, Logistics, Economics, Mathematics, or a related field.
- At least 3-5 years of experience as a data analyst or inventory analyst, or in inventory planning.
- Strong mathematics skills are a must.
- Excellent Excel skills (pivot tables, VLOOKUP, VBA); Python, Power BI, Tableau.
- Experience in the retail/FMCG business is an advantage.
- Good analytical skills.
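The bullet about correcting average daily sales after a promotion period can be sketched in a few lines; the SKU sales figures and promotion flags below are hypothetical.

```python
from statistics import mean

# Hypothetical daily unit sales for one SKU; True marks promotion days.
daily_sales = [(120, False), (115, False), (310, True), (295, True),
               (118, False), (122, False), (117, False)]

# A naive average is inflated by the promotion spike ...
naive_avg = mean(units for units, _ in daily_sales)
# ... so the replenishment parameter should use baseline days only.
baseline_avg = mean(units for units, promo in daily_sales if not promo)

print(f"naive: {naive_avg:.1f}, corrected: {baseline_avg:.1f}")
```

Feeding the corrected baseline, rather than the naive average, into the auto-replenishment parameter avoids over-ordering in the weeks after a promotion ends.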
Skills:
Excel, Meet Deadlines, Power BI
Job type:
Full-time
Salary:
Negotiable
- Collect HR data from a variety of sources, including HRMS, Excel, Google Sheets, and text files.
- Prepare data and develop analytical reports in the HR area, such as an HR dashboard, to support executive decision-making.
- Analyze and prepare executive dashboards to predict outcomes and find solutions with HR team members.
- Support the HR team by providing insightful information to support HR strategies.
- Bachelor's degree in Business Administration or a related field.
- Minimum of 2 years of experience in an HR Data Analyst or HRIS role.
- Ability to manage multiple tasks/projects, work under pressure, and meet deadlines.
- Strong verbal and written communication skills, with excellent presentation abilities.
- Results-driven and solution-oriented.
- Experience with data visualization and analytics platforms such as Microsoft Power BI or Tableau.
- Proficiency in SQL, including platforms like PostgreSQL and Oracle DB.
Skills:
Excel, Python, Power BI
Job type:
Full-time
Salary:
Negotiable
- Create, develop, and monitor auto-replenishment and its parameters.
- Maintain and adjust parameters to optimize stock availability/stock levels during normal and promotion periods.
- Investigate and identify root causes of overstocking and out-of-stock (OOS) at stores/DCs.
- Monitor target stock in normal/seasonal periods to suit business sales targets.
- Adjust daily sales in the system to correct average daily sales after promotion periods.
- Forecast demand for each promotion campaign to manage parameter settings.
- Develop a daily KPI dashboard to monitor sales performance versus the system's suggested numbers.
- Bachelor's degree in Supply Chain, Logistics, Economics, Mathematics, or a related field.
- At least 3-5 years of experience as a data analyst or inventory analyst, or in inventory planning.
- Experience in the retail/FMCG business is an advantage.
- Excellent Excel skills (pivot tables, VLOOKUP, VBA); Python, Power BI, Tableau.
- Good analytical skills.
Skills:
Compliance, Data Analysis, Power BI
Job type:
Full-time
Salary:
Negotiable
- Build and maintain an HR data repository tailored to the food business under ThaiBev group, focusing on metrics critical to food operations, such as labor productivity, turnover by location, and shift coverage efficiency.
- Ensure data integrity and compliance with industry-specific labor regulations, maintaining a transparent and accurate source of HR information.
- Collaborate with operations teams to integrate labor data from multiple food business units, enabling holistic insights across various branches and regions.
- Assist the HR Line Manager with strategic HR analytics for workforce optimization: conduct data analysis on staffing patterns, turnover rates, and workforce efficiency to identify optimization opportunities aligned with food business cycles.
- Use predictive analytics to anticipate workforce needs for peak and off-peak seasons, supporting proactive staffing and cost control with the operations team as part of centralization efforts.
- Assist with commercial structure and labor cost management for food operations: analyze labor costs relative to revenue and operational efficiency within different food outlets, providing insights to optimize staffing while maximizing profitability.
- Support the development of labor cost budgets that align with product pricing and sales targets in the food sector, helping maintain competitive yet profitable operations.
- Generate regular reports on labor cost performance against targets, identifying areas for improvement and enabling business leaders to adjust strategy as needed.
- Lead Power BI development for real-time food business insights: design and deploy Power BI dashboards specific to food operations, offering real-time insights on key metrics such as labor costs, staffing levels, and turnover rates across outlets.
- Collaborate with senior leaders in the food division to customize dashboards, highlighting KPIs that impact food production, service speed, and customer satisfaction.
- Continuously update Power BI capabilities to provide comprehensive, up-to-date views on HR metrics essential to food business strategy.
- 3+ years of experience in analytics or data management; HR-specific experience not required.
- Demonstrated proficiency in Power BI development and advanced Excel skills, including VBA, macros, and pivot tables.
- Prior experience in labor cost analysis and commercial structure evaluation.
- Contact information:
- Oishi Group Public Company Limited.
- CW Tower, No. 90 Ratchadapisek Road, Huai Khwang, Bangkok.
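The labor-cost-versus-revenue analysis this role describes can be illustrated with a minimal sketch; the outlet names and monthly figures below are invented for the example.

```python
# Hypothetical monthly figures per outlet (names and numbers invented).
outlets = {
    "Outlet A": {"labor_cost": 180_000, "revenue": 600_000},
    "Outlet B": {"labor_cost": 150_000, "revenue": 750_000},
}

def labor_cost_ratio(labor_cost, revenue):
    """Share of revenue consumed by labor for one outlet."""
    return labor_cost / revenue

for name, m in outlets.items():
    ratio = labor_cost_ratio(m["labor_cost"], m["revenue"])
    print(f"{name}: labor cost ratio = {ratio:.1%}")
```

Tracking this ratio per outlet against a budget target is the kind of metric the Power BI dashboards above would surface in real time.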
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
Negotiable
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium-to-complex data sets that meet functional and non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In-depth knowledge of the data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- At least 5 years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: the Hadoop ecosystem, cloud technologies (e.g. AWS, Azure, Google Cloud), in-memory database systems (e.g. HANA, Hazelcast), traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools: traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
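As a rough sketch of the extract-transform-load flow these responsibilities describe (hypothetical raw rows, with an in-memory SQLite database standing in for a real warehouse):

```python
import sqlite3

# Hypothetical raw rows as they might arrive from an ingestion layer.
raw_rows = [
    {"customer_id": "C001", "amount": "120.50", "country": "th"},
    {"customer_id": "C002", "amount": None,     "country": "SG"},
    {"customer_id": "C003", "amount": "85.00",  "country": "Th"},
]

def transform(rows):
    """Drop rows missing an amount, cast types, normalize country codes."""
    for r in rows:
        if r["amount"] is None:
            continue
        yield (r["customer_id"], float(r["amount"]), r["country"].upper())

# Load: create the target table and insert the cleaned rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer_id TEXT, amount REAL, country TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transform(raw_rows))

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)
```

A production pipeline would add the pieces the listing mentions: scheduling, monitoring, retention policies, and role-based access, but the extract/transform/load split stays the same.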
Skills:
ETL, Data Analysis, Industry trends
Job type:
Full-time
Salary:
฿70,000 - ฿90,000, negotiable
- Analyze and interpret complex data sets to uncover insights and trends that drive business strategy and decision-making.
- Collaborate with cross-functional teams to understand their data needs and provide actionable recommendations.
- Design and maintain dashboards, reports, and visualizations using tools to communicate insights effectively.
- Extract data from various sources, including databases, APIs, and third-party services, ensuring data quality and accuracy.
- Develop and implement data models, ETL processes, and automated reporting solutions to streamline data analysis.
- Stay updated with industry trends and new technologies to enhance the company's data analytics capabilities.
- Participate in data governance initiatives, ensuring compliance with data privacy and security regulations.
- Requirements/Qualifications (must have):
- Bachelor's degree in Statistics, Data Science, or a related field; an MBA or advanced degree is a plus.
- Minimum of 5 years of experience in business intelligence or data analysis, preferably in a fast-paced e-commerce environment.
- Proficient in SQL and at least one data visualization tool (e.g., Tableau, Power BI), with a solid understanding of data warehousing concepts.
- Strong analytical skills, with the ability to manipulate, clean, and derive insights from large datasets.
- Effective communicator with excellent presentation skills, capable of translating complex data into simple, actionable insights for non-technical stakeholders.
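The data-quality side of this role (profiling extracted data for completeness before it feeds dashboards) can be sketched in a few lines; the column names and rows below are invented.

```python
# Hypothetical extracted dataset: profile null rate and distinct count per column.
rows = [
    {"order_id": 1, "channel": "web",    "amount": 250.0},
    {"order_id": 2, "channel": None,     "amount": 90.0},
    {"order_id": 3, "channel": "mobile", "amount": None},
]

def profile(rows):
    """Per-column completeness and cardinality summary."""
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "null_rate": 1 - len(non_null) / len(values),
            "distinct": len(set(non_null)),
        }
    return report

for col, stats in profile(rows).items():
    print(col, stats)
```

Running a profile like this on each extract makes quality regressions (a source suddenly sending nulls, a column collapsing to one value) visible before they reach a report.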
Skills:
Finance, Compliance
Job type:
Full-time
Salary:
Negotiable
- Conduct detailed analysis of Enterprise Service revenue to identify trends in products and services within AIS Group.
- Verify the accuracy and completeness of revenue collection, promotion packages, and new services to ensure compliance with business conditions.
- Develop appropriate QA measures to minimize revenue loss and operational errors.
- Detect and investigate irregularities affecting revenue, such as real loss, opportunity loss, and fraud.
- Collaborate with relevant departments to address and rectify issues impacting revenue.
- Ensure the accuracy of service charges, promotion packages, and offerings for enterprise customers.
- Review and validate the calculation of postpaid voice, IDD, and IR services in the RBM system to prevent revenue loss.
- Utilize data analytics skills to analyze data from various sources, reflecting trends, performance, and efficiency of products and services.
- Prepare analysis reports to support management in strategy formulation and risk assessment.
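Detecting revenue irregularities, as described above, can be sketched with a robust median/MAD rule; the daily revenue series and threshold below are hypothetical.

```python
from statistics import median

# Hypothetical daily revenue for one service; a median/MAD rule flags
# days that deviate sharply from typical collection levels.
daily_revenue = [1020, 995, 1010, 980, 1005, 410, 1000, 990]

med = median(daily_revenue)
mad = median(abs(x - med) for x in daily_revenue)  # median absolute deviation
anomalies = [x for x in daily_revenue if abs(x - med) > 10 * mad]
print(anomalies)
```

A median-based rule is used here rather than a mean/standard-deviation threshold because a single large loss day inflates the standard deviation and can mask itself; flagged days would then be investigated for real loss, opportunity loss, or fraud.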