Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
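The pipeline responsibilities above can be sketched, in miniature, as a plain-Python batch ETL job; in practice this would run on PySpark against cloud storage, and the file layout, column names, and cleaning rules here are purely hypothetical.

```python
import csv
import io
import json

def run_etl(raw_csv: str) -> list[dict]:
    """Minimal batch ETL: extract CSV rows, transform (clean and filter),
    and return load-ready records. The schema is hypothetical."""
    # Extract: parse the raw CSV into dicts.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: drop rows with missing amounts, normalize types.
    records = []
    for row in rows:
        if not row["amount"]:
            continue  # skip incomplete rows instead of failing the job
        records.append({
            "user_id": row["user_id"].strip(),
            "amount": round(float(row["amount"]), 2),
        })
    return records

raw = "user_id,amount\nu1,19.99\nu2,\nu3,5.5\n"
print(json.dumps(run_etl(raw)))
```

The same extract/transform/load shape carries over to PySpark, where the parsing and filtering steps become DataFrame operations distributed across the cluster.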
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
Skills:
ETL, Java, Python
Job type:
Full-time
Salary:
negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Requirements: Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Strong numerical and analytical skills.
- Experience with BI tools such as Tableau and Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
ETL, Python, Java
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
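As a toy illustration of the data-quality and storage duties listed above, the sketch below runs two simple checks over a warehouse table; `sqlite3` stands in for the actual warehouse, and the `orders` table and its columns are invented for the example.

```python
import sqlite3

# sqlite3 stands in for a warehouse; the table and columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("o1", 10.0), ("o2", None), ("o3", 7.5)])

def quality_report(conn: sqlite3.Connection) -> dict:
    """Simple data-quality checks: row count and null rate on a key column."""
    total = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    nulls = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]
    return {"rows": total, "null_amount_rate": nulls / total}

print(quality_report(conn))
```

Real pipelines usually codify checks like these in a framework (e.g., dbt tests or Great Expectations) and alert when a threshold is breached, but the underlying queries look much the same.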
- Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Experience:
3 years required
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience with data modelling techniques and integration patterns.
- Write data transformation jobs through code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge in data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solutions.
- Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience with data warehouse and ELT solutions, tools, and techniques (e.g., Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI alongside data quality and management, providing opportunities to work on innovative projects and develop new skills in these fields.
- Contact: [email protected] (K.Thipwimon).
- You can read the privacy policy of Krungthai Bank PCL at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Therefore, please do not upload any documents, including copies of your ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the job application on the website. In addition, please make sure that you have removed any sensitive personal data (if any) from your resume and other documents before uploading them. The Bank needs to collect personal data about your criminal record in order to consider your application, or to verify qualifications, prohibited characteristics, or suitability for the position. Consent to the collection, use, or disclosure of your criminal-record data is necessary for entering into a contract and being considered for the above purposes. If you do not give such consent, or later withdraw it, the Bank may be unable to proceed with the above purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
ETL, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Conduct meetings with users to understand data requirements, and design databases based on that understanding and those requirements with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, PostgreSQL.
- Experience in Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Having good attitude toward team working and willing to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit, income-estimation, and fraud models.
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop sentiment analysis models to support collection strategy.
- Bachelor's degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience in programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience in building and implementing AI/ML solutions for a senior role.
- Experience with Python libraries: NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
Experience:
3 years required
Skills:
Big Data, ETL, SQL, Python
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and design databases based on that understanding and those requirements with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, PostgreSQL.
- Experience in Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Having good attitude toward team working and willing to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development and programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Define data retention policies, monitor performance, and advise on any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration; hands-on development skills in Informatica IICS, Databricks/Spark, and Python are a must.
- Clear hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, Cosmos), cloud technologies (e.g., AWS, Azure, Google Cloud), and NoSQL databases (e.g., Cosmos DB, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge of and working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
Experience:
No experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day-to-day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- First tier investigate any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Ability to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Job type:
Full-time
Salary:
negotiable
- Collect and organize data from various sources; ensure the accuracy, completeness, and consistency of data for analysis.
- Maintain and update datasets, ensuring that they remain current and relevant.
- Apply statistical techniques and data analysis methods to interpret complex datasets.
- Identify trends, patterns, correlations, and anomalies within the data that can inform business decisions.
- Use data visualization tools (e.g., Tableau, Power BI, or Excel) to create interactive charts, dashboards, and graphs that present key insights.
- Communicate findings clearly through visual representation, making complex data easier to understand for non-technical stakeholders.
- Prepare and present reports summarizing data analysis, trends, and recommendations.
- Create regular performance reports, including key performance indicators (KPIs), to track business performance.
- Identify business problems that can be addressed through data analysis.
- Provide actionable insights and recommendations to improve business processes, performance, or strategy.
- Work closely with other departments (e.g., marketing, sales, finance, operations) to understand their data needs and provide analytical support.
- Collaborate with data scientists, business analysts, or IT teams to ensure data is captured, processed, and analyzed correctly.
- Continuously improve analytical processes and methodologies to increase efficiency and accuracy.
- Other tasks as assigned by the Assistant Manager / HOD.
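The statistical-analysis and KPI-reporting tasks described above can be illustrated with a few lines of standard-library Python; the sales figures and KPI definitions below are purely hypothetical.

```python
import statistics

# Hypothetical monthly sales figures; the KPI definitions are illustrative.
monthly_sales = [120, 135, 128, 160, 155, 170]

kpis = {
    # Central tendency and spread of the series.
    "mean": statistics.fmean(monthly_sales),
    "stdev": statistics.stdev(monthly_sales),
    # Month-over-month growth of the latest period.
    "mom_growth": (monthly_sales[-1] - monthly_sales[-2]) / monthly_sales[-2],
}
print(kpis)
```

In practice these numbers would be computed in SQL or pandas over the maintained datasets and then surfaced in a Tableau or Power BI dashboard, but the metric definitions are the same.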
- Problem-solving skills
- Negotiation skills
- Time-management skills
- Familiarity with management systems
- Advanced skills in operating office software
- Good command of English, able to correspond with overseas counterparts
- Good personality: diligent, hardworking, responsible, and honest.
Skills:
Power BI, SQL, Python, English
Job type:
Full-time
Salary:
negotiable
- Understand and document the business requirements for developing effective data solutions.
- Develop dashboards, reports, and analyses that provide actionable insights.
- Balance retail/wholesale business acumen with data management expertise and technical proficiency.
- Collaborate closely with data engineers, data scientists, and business partners.
- Deliver robust, scalable data solutions with a focus on speed, performance, security, governance, and architecture.
- Mentor junior data analysts and promote best practices within the team.
- Requirements: 5+ years of experience as a data analyst, preferably in retail or wholesale industries.
- Proven expertise in Power BI, SQL, Python and Cloud Data Platforms.
- Demonstrated ability to translate complex business requirements into effective data solutions.
- Strong leadership skills, including: Mentoring and developing junior analysts.
- Managing cross-functional projects.
- Guiding teams through complex BI development initiatives.
- Excellent communication skills, including: Fluency in verbal and written English.
- Ability to clearly articulate technical concepts to stakeholders with varying technical backgrounds.
- Track record of successful collaboration with various departments and stakeholders.
Skills:
Power BI, SQL, Python, English
Job type:
Full-time
Salary:
negotiable
- Understand and document business requirements for developing data solutions.
- Develop dashboards, reports, and analyses that provide actionable insights.
- Balance retail/wholesale business acumen with data management expertise and technical proficiency.
- Collaborate closely with data engineers, data scientists, and business partners.
- Contribute to the delivery of robust, scalable data solutions with a focus on performance, security, and governance.
- Requirements: Bachelor's degree in STEM or a related field.
- 1-3 years of experience as a data analyst, preferably in the retail or wholesale industries.
- Proficiency in Power BI, SQL, and Python.
- Basic understanding of cloud data platforms.
- Ability to translate business requirements into effective data solutions.
- Strong communication skills, including: Fluency in verbal and written English.
- Ability to clearly articulate technical concepts to stakeholders with varying technical backgrounds.
Experience:
3 years required
Skills:
Business Development, Data Analysis, SQL, English
Job type:
Full-time
Salary:
negotiable
- Lead the development and execution of data-driven strategies to optimize sales and business development efforts within seller segment.
- Analyze large datasets to identify trends, opportunities, and potential risks, providing actionable insights to the sales and category management teams.
- Collaborate with cross-functional teams to design and implement data visualization tools and dashboards for monitoring performance and decision-making.
- Monitor market trends, competitors, and customer behavior to inform category strategies and adjust as needed to maintain a competitive edge.
- Develop and maintain predictive models to forecast sales, identify potential upselling and cross-selling opportunities, and assess the impact of promotional activities.
- Bachelor's degree in Business Administration.
- Minimum of 3 years of experience in data analytics, with a focus on sales and/or category management in a fast-paced, e-commerce environment.
- Proficient in using data analysis tools such as SQL, Python, R, and experience with data visualization platforms like Tableau or Power BI.
- Strong understanding of statistical analysis and modeling techniques, with the ability to communicate complex findings to non-technical stakeholders.
- Excellent interpersonal and communication skills, capable of building relationships and influencing decision-making across different teams.
Skills:
Power BI, Excel, Microsoft Office
Job type:
Full-time
Salary:
negotiable
- Analyze and prepare comparative reports (daily, monthly, yearly) against historical data, targets, and future forecasts, for presentation to management.
- Prepare the annual budget for sales and sales-promotion expenses.
- Prepare the monthly Profit & Loss summary report from a management perspective, for presentation to management.
- Build databases to support the calculations and reporting of related functions, such as employee incentive targets and reward targets.
- Design dashboards (e.g., in Power BI) for relevant data so that users can understand and apply them easily.
- Design templates for related departments to support data collection and speed up results.
- Analyze other relevant data as assigned.
- Advise other departments on data analysis.
- Other duties as assigned.
- Advanced proficiency in MS Excel.
- Good command of other computer programs: Microsoft Office (Word, PowerPoint).
- Systematic analytical, planning, and management skills, with high standards of work.
- Communication and negotiation skills.
- Ability to learn new things quickly.
- Presentation skills.
- Power BI skills (preferred).
- Basic programming knowledge (preferred).
- Contact:
- Modern Trade Management Co., Ltd.
- Lao Peng Nguan Tower 1, 26th Floor, Vibhavadi Rangsit Road, Chom Phon, Chatuchak, Bangkok.
Skills:
Finance, Compliance
Job type:
Full-time
Salary:
negotiable
- Conduct detailed analysis of Enterprise Service revenue to identify trends in products and services within AIS Group.
- Verify the accuracy and completeness of revenue collection, promotion packages, and new services to ensure compliance with business conditions.
- Develop appropriate QA measures to minimize revenue loss and operational errors.
- Detect and investigate irregularities affecting revenue, such as real loss, opportunity loss, and fraud.
- Collaborate with relevant departments to address and rectify issues impacting revenue.
- Ensure the accuracy of service charges, promotion packages, and offerings for enterprise customers.
- Review and validate the calculation of postpaid voice, IDD, and IR services in the RBM system to prevent revenue loss.
- Utilize data analytics skills to analyze data from various sources, reflecting trends, performance, and efficiency of products and services.
- Prepare analysis reports to support management in strategy formulation and risk assessment.
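The irregularity-detection duty above (real loss, opportunity loss, fraud) could be sketched, under assumptions, as a simple outlier check on daily revenue; production revenue-assurance checks in a system like RBM would be far richer.

```python
import statistics

def flag_irregular(revenues, threshold=3.0):
    """Return indices whose revenue deviates more than `threshold`
    standard deviations from the mean — candidates for investigation."""
    mean = statistics.fmean(revenues)
    sd = statistics.pstdev(revenues)
    if sd == 0:
        return []  # perfectly flat series: nothing to flag
    return [i for i, r in enumerate(revenues) if abs(r - mean) / sd > threshold]

daily = [100, 102, 98, 101, 99, 100, 5]  # last day collapses: possible leakage
print(flag_irregular(daily, threshold=2.0))  # [6]
```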
Skills:
Excel, Python, Power BI
Job type:
Full-time
Salary:
negotiable
- Identify and provide lists of non-performing, never-sold, and non-Planogram inventory to Merchandise & Buyers, follow up on actions, and simulate the impact of markdown pricing.
- Analyze the root causes behind increases in non-performing inventory and provide conclusions and recommendations for next-step actions, working through the process with the relevant parties.
- Work with Store Operations to follow up on execution to clear this inventory.
- Bachelor's degree in Supply Chain, Logistics, Economics, Mathematics, or another related field.
- At least 3-5 years of experience as a Data Analyst or Inventory Analyst, or in Inventory Planning.
- Excellent Excel skills (Pivot, VLOOKUP, VBA); Python, Power BI, Power Query, Tableau.
- Experience in the Retail/FMCG business would be an advantage.
- Good analytical skills.
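The non-performing-inventory and markdown-simulation duties above can be illustrated with a small sketch (the field names and discount rate are hypothetical; a real analysis would run over the merchandising database):

```python
def non_performing(items):
    """SKUs with stock on hand but no sales in the review window."""
    return [it["sku"] for it in items if it["sold"] == 0 and it["on_hand"] > 0]

def markdown_impact(items, discount=0.30):
    """Revenue forgone if non-performing stock is cleared at a discount."""
    return sum(it["on_hand"] * it["price"] * discount
               for it in items if it["sold"] == 0 and it["on_hand"] > 0)

stock = [
    {"sku": "A1", "sold": 12, "on_hand": 40, "price": 10.0},
    {"sku": "B2", "sold": 0,  "on_hand": 25, "price": 8.0},
]
print(non_performing(stock))             # ['B2']
print(round(markdown_impact(stock), 2))  # 25 * 8.0 * 0.30 = 60.0
```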
Skills:
Data Analysis, Excel, Power BI
Job type:
Full-time
Salary:
negotiable
- Graduate with a Bachelor's/Master's Degree in Economics, Engineering, IT, or other related fields.
- Experience in data analysis or performance reporting.
- Strong data analysis skills using Excel, Power BI, and SQL.
- Ability to use VBA/Macros will be given special consideration.
- Experience in Retail businesses will be given special consideration.
- Tasks & responsibilities.
- Prepare and deliver daily, weekly, and monthly business reports to key stakeholders and business controllers.
- Recommend IT solutions for optimizing report generation across Financial, Commercial, and Operational areas.
- Ensure accuracy and consistency in data, readying it for comprehensive reporting.
- Provide data support to internal teams to meet reporting needs.
- Prepare additional ad hoc reports as assigned.
Job type:
Full-time
Salary:
negotiable
- Analyze data and provide business insight.
- Assess risk and design mitigation actions based on data insight.
- Design dashboards / predictive models for decision-making management.
- Lead and Supervise team by projects.
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch / real-time / near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good in communication and writing in English.
- Good interpersonal and communication skills.
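As a rough illustration of the batch ETL work described above (hypothetical table and column names; real pipelines would use the Glue/Databricks-style tools the posting mentions):

```python
import csv, io, sqlite3

RAW = "user_id,amount\n1,10.5\n2,abc\n3,7.0\n"   # 'abc' is a bad record

def extract(text):
    """Extract: parse raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: keep rows with a parseable amount; cast types."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["user_id"]), float(r["amount"])))
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
    return clean

def load(rows):
    """Load: write clean rows into a (here, in-memory) SQL table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
print(con.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
# (2, 17.5)
```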
Skills:
Data Analysis, English
Job type:
Full-time
Salary:
negotiable
- Data Innovation Team.
- Based at Infinitas by Krungthai.
- As a data analyst at Infinitas, you can expect a new level of analytical experience. As part of the Data Innovation team, your goal is to make data useful for all stakeholders. Under the umbrella of the leading bank in Thailand, Infinitas leverages the largest financial data sets from both traditional and mobile banking services, from customer digital footprints to branch, ATM, and call center. We mea ...
- Job Responsibilities
- Conduct data inventory research with product owners, business owners, and IT BAs to gain a full understanding of data availability.
- Communicate with business owners to translate business problems and challenges into actionable analytical solutions.
- Initiate EDA ideas to surface hidden opportunities across customer, product, channel, and other areas.
- Analyze digital and traditional user journey funnel and customer persona
- Visualize data for fast decision making and insight interpretation
- Define customer segmentations for strategy planning and marketing targeting
- Plan holistic A/B testing campaigns to evaluate data values on business impact
- Design and fulfill monitoring dashboards and automated reports
- English as working language
- Minimum of 3 years data analytics related working experiences
- At least 1 year of working experience directly communicate to business team
- Proficient in Python or SQL
- Advanced hands on experiences with visualization tool
- Strong communication and analytical thinking skills
- Good balance of data and business knowledge
- Fintech or banking industry
- Internet companies with mobile applications.
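The A/B-testing responsibility above can be sketched as a two-proportion z-test on conversion counts (illustrative numbers; campaign evaluation in practice would also consider power, guardrail metrics, and multiple testing):

```python
import math

def two_prop_z(conv_a, n_a, conv_b, n_b):
    """z statistic and two-sided p-value for a difference in conversion
    rates, using the pooled rate and the normal CDF built from math.erf."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)              # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_prop_z(conv_a=200, n_a=1000, conv_b=260, n_b=1000)
print(f"z={z:.2f}, p={p:.4f}")  # an uplift from 20% to 26% at n=1000 each
```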
- You can read the privacy policy of Krungthai Bank PCL at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type that may appear on a copy of your national ID card. Therefore, please do not upload any documents, including copies of your national ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the job application on the website. In addition, please make sure that any sensitive personal data (if any) has been deleted from your resume and any other documents before uploading them. The Bank does need to collect personal data about your criminal record in order to consider you for employment, or to verify qualifications, disqualifications, or suitability for a position; consent to the collection, use, or disclosure of such data is necessary for entering into a contract and for consideration under the purposes above. If you do not give consent, or later withdraw it, the Bank may be unable to fulfil those purposes, and you may lose the opportunity to be considered for employment with the Bank.