Skills:
Big Data, Java, Python
Job Type:
Full-time
Salary:
Negotiable
- Background in programming, databases and/or big data technologies, OR a BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium-to-complex data sets that meet functional and non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure the technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role-based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In-depth knowledge of the data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: the Hadoop ecosystem, cloud technologies (e.g. AWS, Azure, Google), in-memory database systems (e.g. HANA, Hazelcast), traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools - traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
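As a rough illustration of the extract-transform-load work this posting describes, here is a minimal stdlib-only sketch; the table, column names, and sample values are all hypothetical, not part of the employer's actual stack.

```python
# Minimal ETL sketch: extract raw CSV rows, transform (cast + aggregate),
# and load the result into a warehouse-style table.
import csv, io, sqlite3

RAW = "customer_id,amount\n1,10.50\n2,3.25\n1,4.00\n"  # stand-in for a source extract

def extract(raw_text):
    """Extract: parse raw CSV rows from a source system."""
    return list(csv.DictReader(io.StringIO(raw_text)))

def transform(rows):
    """Transform: cast types and aggregate spend per customer."""
    totals = {}
    for r in rows:
        cid = int(r["customer_id"])
        totals[cid] = totals.get(cid, 0.0) + float(r["amount"])
    return totals

def load(totals, conn):
    """Load: write the aggregated result into a target table."""
    conn.execute("CREATE TABLE customer_spend (customer_id INTEGER, total REAL)")
    conn.executemany("INSERT INTO customer_spend VALUES (?, ?)", totals.items())

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
result = dict(conn.execute("SELECT customer_id, total FROM customer_spend"))
```

In production the same three stages would typically target Spark or a cloud warehouse rather than SQLite, but the shape of the pipeline is the same.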
Skills:
ETL, Java, Python
Job Type:
Full-time
Salary:
Negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation and deployment of emerging tools and process for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
Requirements:
- Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Strong numerical and analytical skills.
- Worked with BI tools such as Tableau, Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
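Dimensional modeling, mentioned in the last requirement above, can be illustrated with a toy star schema: a fact table of events joined to a dimension table of descriptive attributes. The tables and data below are made up for illustration.

```python
# Toy star-schema query: aggregate a fact table by a dimension attribute.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'drinks'), (2, 'snacks');
INSERT INTO fact_sales VALUES (1, 3), (2, 5), (1, 2);
""")
rows = conn.execute("""
    SELECT p.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
```

The same join-then-aggregate pattern underlies most BI-tool reports (Tableau, Power BI) built on a warehouse.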
Experience:
3+ years
Skills:
Big Data, Hive, SAS
Job Type:
Full-time
Salary:
Negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience of data modelling techniques and integration patterns.
- Write data transformation jobs through code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge in data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solution.
- Practical experience with modern and traditional Big Data stacks (e.g. BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience working with data warehouse and ELT solutions, tools, and techniques (e.g. Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI alongside data quality and management, with opportunities to work on innovative projects and develop new skills in these fields.
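The "data quality tests, monitoring, cataloging" duty above can be sketched as a pair of simple rule checks; the rules and sample records here are illustrative assumptions, not the bank's actual framework.

```python
# Two common data-quality rules: completeness (no nulls) and uniqueness.
def check_not_null(rows, column):
    """Completeness: every row has a value for the column."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """Uniqueness: no duplicate values in the column."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
]
passed = check_not_null(records, "email") and check_unique(records, "id")
```

Frameworks such as dbt tests or Great Expectations wrap the same idea in declarative rules run on every pipeline execution.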
- Contact: [email protected] (K.Thipwimon).
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Therefore, please do not upload any documents, including copies of your national ID card, or enter any sensitive personal data or other information that is not relevant or necessary for the purpose of your job application on the website. Please also ensure that any sensitive personal data (if any) has been removed from your resume and other documents before uploading them. The Bank needs to collect personal data concerning your criminal record for the purpose of considering your application, or verifying qualifications, disqualifications, or suitability for the position. Your consent to the collection, use, or disclosure of your criminal-record data is necessary for entering into a contract and for being considered for the purposes above. If you do not give consent, or later withdraw it, the Bank may be unable to proceed for those purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
ETL, SQL, Hadoop
Job Type:
Full-time
Salary:
Negotiable
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business preferred).
- Expert experience with query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Big Data, ETL, SQL
Job Type:
Full-time
Salary:
Negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business preferred).
- Expert experience with query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
Experience:
No prior work experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job Type:
Full-time
- Provide day-to-day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- First tier investigate any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Automation, Product Owner, Python
Job Type:
Full-time
Salary:
Negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration and data migration. Informatica IICS, Databricks/Spark & Python hands-on development skills are a must.
- Clear hands-on experience with database systems - SQL server, Oracle, Azure Synapse, Snowflake and Cosmos, Cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge & working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
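The "real-time integration patterns and ETL/ELT best practices" this posting asks for often come down to incremental loads: merging changed and new rows into a target table rather than reloading it. A minimal upsert sketch (hypothetical table and key names, SQLite standing in for the warehouse):

```python
# Incremental "upsert" load: update changed rows, insert new ones.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'old'), (2, 'old')")

incoming = [(2, 'new'), (3, 'new')]  # one changed row, one brand-new row
conn.executemany(
    "INSERT INTO target(id, status) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET status = excluded.status",
    incoming,
)
rows = conn.execute("SELECT id, status FROM target ORDER BY id").fetchall()
```

Warehouses like Snowflake and Databricks express the same pattern with a `MERGE INTO` statement.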
Skills:
ETL, Python, Java
Job Type:
Full-time
Salary:
Negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
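"Develop and maintain data models and schemas" (above) can be illustrated by encoding a record schema directly in code, so type and validity constraints are enforced at ingestion; the entity and field names below are hypothetical.

```python
# A record schema as a frozen dataclass with a validity constraint.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class OrderRecord:
    order_id: int
    customer_id: int
    order_date: date
    total: float

    def __post_init__(self):
        # Reject invalid records at construction time.
        if self.total < 0:
            raise ValueError("total must be non-negative")

rec = OrderRecord(order_id=1, customer_id=42, order_date=date(2024, 1, 5), total=99.0)
```

Schema libraries (e.g. Pydantic, or Spark StructTypes) extend this idea with serialization and richer validation.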
Job Type:
Full-time
Salary:
Negotiable
- Analyze data and provide business insight.
- Assess risk and design mitigation actions based on data insight.
- Design dashboards / predictive models for decision-making management.
- Lead and supervise the team on projects.
Experience:
3+ years
Skills:
Compliance, Legal, Risk Management
Job Type:
Full-time
Salary:
Negotiable
- Develop data security policy review, data security policy exceptions, and control risk mitigation processes.
- Define the security controls for access management lifecycle (i.e., requirement for creation, deletion, transfer and review).
- Operate: Advise on technology relating to Data Privacy and Protection (i.e., PDPA) related security controls implementation.
- Drive and support data security controls such as Data Loss Prevention (DLP), Data Masking, Data Encryption capabilities to protect sensitive data.
- Drive compliance (or collaborate with compliance team) to organization security policies, standards, metrics, and legal requirements.
- Communicate and enforce security policies, rules, and standards.
- Conduct impact assessment of data initiatives from a security point of view.
- Ensure cryptographic keys and related components are kept safe and that confidential information is protected.
- Resolve data security audit and risk findings.
- Review and develop security controls to current access controls policies and procedures.
- Provide requirements for create and manage roles, access rights (includes privileged access), authentication and identity within the environment.
- Conduct periodic review of user access.
- Review, approve and monitor the usage of privileged access.
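One of the controls named above, data masking, can be sketched in a few lines: irreversibly obscuring a sensitive field before it leaves a secure zone. The keep-last-4-digits rule is a common convention used here only as an example.

```python
# Toy data-masking rule: hide all but the last four digits of an account number.
def mask_account(number: str) -> str:
    digits = number.replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

masked = mask_account("1234-5678-9012")
```

Production DLP and masking tools apply rules like this consistently across databases, exports, and logs, which is what the policy work above governs.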
- EDUCATION.
- Bachelor's degree in Computer Science, Information Systems, or equivalent education or work experience.
- EXPERIENCE.
- Work experience in privacy, compliance, information security, auditing or a related field may also be accepted as an alternative, according to the Cybersecurity function.
- Minimum 3 years of experience in and strong knowledge of privacy, data, operational risk management, information security, or related areas in IT.
Skills:
Data Analysis, English
Job Type:
Full-time
Salary:
Negotiable
- Data Innovation Team.
- Based at Infinitas by Krungthai.
- As a data analyst in Infinitas, you would expect a new level of analytical experiences here. Being part of Data Innovation team, your goal is to make data useful for all stakeholders. Under the big umbrella of the leading bank in Thailand, Infinitas leverage the largest financial data sets from both traditional banking and mobile banking services. From customer digital footprint to branch, ATM and call center. We mea ...
- Job Responsibilities
- Conduct data inventory research with the product owner, business owner and IT BA to gain a full understanding of data availability.
- Communicate with business owners to translate business problem/challenge into actionable analytical solution
- Initiate EDA ideas to tag hidden opportunities for customer, product, channel and other various areas.
- Analyze digital and traditional user journey funnel and customer persona
- Visualize data for fast decision making and insight interpretation
- Define customer segmentations for strategy planning and marketing targeting
- Plan holistic A/B testing campaigns to evaluate data values on business impact
- Design and fulfill monitoring dashboards and automated reports
- English as working language
- Minimum of 3 years data analytics related working experiences
- At least 1 year of working experience directly communicate to business team
- Proficient in Python or SQL
- Advanced hands-on experience with visualization tools
- Strong communication and analytical thinking skills
- Good balance of data and business knowledge
- Fintech or banking industry
- Internet companies with mobile application.
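The "holistic A/B testing campaigns" responsibility above usually ends in a significance check on conversion rates. A stdlib-only sketch using a two-proportion z-test; the visitor and conversion counts are made up.

```python
# Two-proportion z-test for an A/B conversion experiment.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts 26% vs. control's 20% on 1,000 visitors each.
z, p = two_proportion_z(conv_a=200, n_a=1000, conv_b=260, n_b=1000)
```

A small p-value (conventionally below 0.05) is what lets the analyst attribute the lift to the campaign rather than to noise.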
Skills:
Excel, Python, Power BI
Job type:
Full-time
Salary:
Negotiable
- Create, develop, and monitor auto-replenishment and its parameters.
- Maintain and adjust parameters to optimize stock availability/stock levels during normal and promotion periods.
- Investigate and identify root causes of overstocking and OOS at Store/DC.
- Monitor target stock in normal/seasonal periods to suit business sales targets.
- Adjust daily sales in the system to correct average daily sales after promotion periods.
- Forecast demand for each promotion campaign to manage parameter settings.
- Develop a daily KPI dashboard to monitor sales performance versus the system's suggested numbers.
- Bachelor's Degree in Supply Chain, Logistics, Economics, Mathematics, or other related fields.
- 3-5 years' experience as a Data Analyst, Inventory Analyst, or in Inventory Planning.
- Strong mathematics skills are a must.
- Excellent Excel skills (Pivot, VLOOKUP, VBA), plus Python, Power BI, and Tableau.
- Experience in Retail/FMCG business would be an advantage.
- Good analytical skills.
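The replenishment-parameter duties above usually hinge on a reorder-point calculation: average demand over the supplier lead time plus a safety stock that buffers demand variability. A minimal sketch with hypothetical sales history and lead time:

```python
from statistics import mean, stdev

# Hypothetical daily sales history for one SKU (units/day)
daily_sales = [12, 15, 9, 14, 11, 13, 16, 10, 12, 14]
lead_time_days = 5    # assumed supplier lead time
z_service = 1.65      # z-score for roughly a 95% service level

avg_daily = mean(daily_sales)
sd_daily = stdev(daily_sales)  # sample standard deviation

# Safety stock covers demand variability over the lead time
safety_stock = z_service * sd_daily * lead_time_days ** 0.5
reorder_point = avg_daily * lead_time_days + safety_stock
```

The same quantities (average daily sales, demand variability, target service level) are what the "parameter setting" and post-promotion daily-sales corrections in the duties above feed into; real systems add review periods and lead-time variability.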
Skills:
Excel, Meet Deadlines, Power BI
Job type:
Full-time
Salary:
Negotiable
- Collect HR data from a variety of sources, including HRMS, Excel, Google Sheets, and text files.
- Prepare data and develop analytical reports in the HR area, such as an HR Dashboard, to support executive decision-making.
- Analyze and prepare an Executive Dashboard to predict issues and find solutions with HR team members.
- Support the HR team by providing insightful information to support HR strategies.
- Bachelor's Degree in Business Administration or a related field.
- Minimum of 2 years of experience in an HR Data Analyst or HRIS role.
- Ability to manage multiple tasks/projects, work under pressure, and meet deadlines.
- Strong verbal and written communication skills, with excellent presentation abilities.
- Results-driven and solution-oriented.
- Experience with data visualization and analytics platforms such as Microsoft Power BI or Tableau.
- Proficiency in SQL, including platforms like PostgreSQL and Oracle DB.
Skills:
Excel, Python, Power BI
Job type:
Full-time
Salary:
Negotiable
- Create, develop, and monitor auto-replenishment and its parameters.
- Maintain and adjust parameters to optimize stock availability/stock levels during normal and promotion periods.
- Investigate and identify root causes of overstocking and OOS at Store/DC.
- Monitor target stock in normal/seasonal periods to suit business sales targets.
- Adjust daily sales in the system to correct average daily sales after promotion periods.
- Forecast demand for each promotion campaign to manage parameter settings.
- Develop a daily KPI dashboard to monitor sales performance versus the system's suggested numbers.
- Bachelor's Degree in Supply Chain, Logistics, Economics, Mathematics, or other related fields.
- 3-5 years' experience as a Data Analyst, Inventory Analyst, or in Inventory Planning.
- Experience in Retail/FMCG business would be an advantage.
- Excellent Excel skills (Pivot, VLOOKUP, VBA), plus Python, Power BI, and Tableau.
- Good analytical skills.
Skills:
Compliance, Data Analysis, Power BI
Job type:
Full-time
Salary:
Negotiable
- Build and maintain an HR data repository tailored to the food business under ThaiBev group, focusing on metrics critical to food operations, such as labor productivity, turnover by location, and shift coverage efficiency.
- Ensure data integrity and compliance with industry-specific labor regulations, maintaining a transparent and accurate source of HR information.
- Collaborate with operations teams to integrate labor data from multiple food business units, enabling holistic insights across various branches and regions.
- Assist the HR Line Manager on Strategic HR Analytics for Workforce Optimization: Conduct data analysis on staffing patterns, turnover rates, and workforce efficiency to identify optimization opportunities aligned with food business cycles.
- Use predictive analytics to anticipate workforce needs for peak and off-peak seasons, aiding proactive staffing and cost control with the operations team as part of centralization.
- Assist on Commercial Structure and Labor Cost Management for Food Operations: Analyze labor costs relative to revenue and operational efficiency within different food outlets, providing insights to optimize staffing while maximizing profitability.
- Support the development of labor cost budgets that align with product pricing and sales targets in the food sector, helping maintain competitive yet profitable operations.
- Generate regular reports on labor cost performance against targets, identifying areas for improvement and enabling business leaders to adjust strategy as needed.
- Lead Power BI Development for Real-Time Food Business Insights: Design and deploy Power BI dashboards specific to food operations, offering real-time insights on key metrics like labor costs, staffing levels, and turnover rates across outlets.
- Collaborate with senior leaders in the food division to customize dashboards, highlighting KPIs that impact food production, service speed, and customer satisfaction.
- Continuously update Power BI capabilities to provide comprehensive, up-to-date views on HR metrics essential to food business strategy.
- 3+ years of experience in analytics or data management; HR-specific experience is not required.
- Demonstrated proficiency in Power BI development and advanced Excel skills, including VBA, macros, and pivot tables.
- Prior experience in labor cost analysis and commercial structure evaluation.
- Contact Information:
- Oishi Group Public Company Limited.
- CW Tower, No. 90 Ratchadaphisek Road, Huai Khwang, Bangkok.
Skills:
ETL, Data Analysis, Industry trends
Job type:
Full-time
Salary:
฿70,000 - ฿90,000, negotiable
- Analyze and interpret complex data sets to uncover insights and trends that drive business strategy and decision-making.
- Collaborate with cross-functional teams to understand their data needs and provide actionable recommendations.
- Design and maintain dashboards, reports, and visualizations using tools to communicate insights effectively.
- Extract data from various sources, including databases, APIs, and third-party services, ensuring data quality and accuracy.
- Develop and implement data models, ETL processes, and automated reporting solutions to streamline data analysis.
- Stay updated with industry trends and new technologies to enhance the company's data analytics capabilities.
- Participate in data governance initiatives, ensuring compliance with data privacy and security regulations.
- Requirements/Qualifications (must have):
- Bachelor's degree in Statistics, Data Science, or a related field; an MBA or advanced degree is a plus.
- Minimum of 5 years of experience in business intelligence or data analysis, preferably in a fast-paced e-commerce environment.
- Proficient in SQL and at least one data visualization tool (e.g., Tableau, Power BI), with a solid understanding of data warehousing concepts.
- Strong analytical skills, with the ability to manipulate, clean, and derive insights from large datasets.
- Effective communicator with excellent presentation skills, capable of translating complex data into simple, actionable insights for non-technical stakeholders.
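The ETL and automated-reporting duties above follow a common pattern: extract raw rows, transform them (drop bad records, aggregate), and load the result into a reporting table. A minimal sketch using an in-memory SQLite database; the table and column names are illustrative assumptions:

```python
import sqlite3

# Extract: a raw orders table (one row has a missing amount on purpose)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 200.0), ("South", None)],
)

# Transform + Load: drop bad rows, aggregate revenue per region into a
# reporting table that a dashboard tool could read directly
conn.execute(
    "CREATE TABLE revenue_by_region AS "
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders WHERE amount IS NOT NULL GROUP BY region"
)

rows = dict(conn.execute("SELECT region, revenue FROM revenue_by_region"))
```

Production pipelines would pull from the databases, APIs, and third-party services mentioned above rather than hard-coded rows, but the quality check (filtering NULLs) and aggregation step are the same shape.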
Skills:
Power BI, Excel, Microsoft Office
Job type:
Full-time
Salary:
Negotiable
- Analyze and prepare comparative reports (e.g., daily, monthly, yearly) against historical data, targets, and future forecasts, for presentation to management.
- Prepare the annual budget for sales and sales-promotion expenses.
- Prepare the monthly Profit & Loss summary from a management perspective for presentation to management.
- Build databases to support the calculations/reports of other related functions, e.g., employee incentive targets and reward payment targets.
- Design dashboards for relevant data, e.g., in Power BI, that users can easily understand and apply.
- Design templates for related functions to support data collection and deliver results faster.
- Analyze other related data as assigned.
- Advise other data-using functions on data analysis.
- Other duties as assigned.
- Advanced skills in MS Excel.
- Good skills in other computer programs: Microsoft Office; Word, PowerPoint.
- Analytical, planning, and systematic management skills, with high working standards.
- Communication and negotiation skills.
- Able to learn new things quickly.
- Presentation skills.
- Power BI skills (if any).
- Basic programming knowledge (if any).
- Contact:
- Modern Trade Management Co., Ltd.
- Lao Peng Nguan Tower 1, 26th Floor, Vibhavadi Rangsit Road, Chom Phon, Chatuchak, Bangkok.
Skills:
SQL, Research, Java
Job type:
Full-time
Salary:
Negotiable
- Background in SQL, databases and/or data science OR.
- BS/MS in software engineering, computer science, mathematics.
- Document data sources in enterprise data catalog with metadata, lineage and classification information.
- Develop low-complexity aggregations and algorithms needed for reporting and analytics.
- Implement minor changes to existing data visualization applications, reporting dashboards.
- Document modifications to reporting applications based on modifications applied.
- Comprehend and adhere to all data security policies and procedures.
- Create data tools for analytics and data scientist team members.
- Build analytical tools to provide actionable insights into key business KPIs, etc.
- Work with data engineers to optimize pipelines for scalability and data delivery.
- Functional Competency.
- Working knowledge of data and analytics frameworks supporting data lakes, warehouses, marts, reporting, etc.
- Experience with data tools for visualizations, analytics and reporting.
- Strong analytical skills with ability to research, assess and develop observations/findings.
- Ability to communicate findings, approaches to cross functional teams and stakeholders.
- 3+ years' hands-on experience with a data science background.
- Some programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: Cloud technologies (e.g. AWS, Azure, Google), in-memory database systems (e.g. HANA, Hazelcast), traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
Skills:
Finance, Compliance
Job type:
Full-time
Salary:
Negotiable
- Conduct detailed analysis of Enterprise Service revenue to identify trends in products and services within AIS Group.
- Verify the accuracy and completeness of revenue collection, promotion packages, and new services to ensure compliance with business conditions.
- Develop appropriate QA measures to minimize revenue loss and operational errors.
- Detect and investigate irregularities affecting revenue, such as real loss, opportunity loss, and fraud.
- Collaborate with relevant departments to address and rectify issues impacting revenue.
- Ensure the accuracy of service charges, promotion packages, and offerings for enterprise customers.
- Review and validate the calculation of postpaid voice, IDD, and IR services in the RBM system to prevent revenue loss.
- Utilize data analytics skills to analyze data from various sources, reflecting trends, performance, and efficiency of products and services.
- Prepare analysis reports to support management in strategy formulation and risk assessment.
Skills:
Data Analysis, Excel, Power BI
Job type:
Full-time
Salary:
Negotiable
- Bachelor's/Master's Degree in Economics, Engineering, IT, or other related fields.
- Experience in data analysis or performance reporting.
- Strong data analysis skills using Excel, Power BI, and SQL.
- Ability to use VBA/Macros will be given special consideration.
- Experience in Retail businesses will be given special consideration.
- Tasks & responsibilities.
- Prepare and deliver daily, weekly, and monthly business reports to key stakeholders and business controllers.
- Recommend IT solutions for optimizing report generation across Financial, Commercial, and Operational areas.
- Ensure accuracy and consistency in data, readying it for comprehensive reporting.
- Provide data support to internal teams to meet reporting needs.
- Prepare additional ad hoc reports as assigned..