Experience:
3 years required
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
negotiable
- Analyze and organize raw data.
- Combine raw information from different sources.
- Design and build data models to support business requirements.
- Develop and maintain data ingestion and processing systems.
- Implement data storage solutions (databases and data lakes).
- Ensure data consistency and accuracy through data validation and cleansing techniques (see the sketch after this list).
- Conduct complex data analysis and report on results.
- Explore ways to enhance data quality and reliability.
- Work with cross-functional teams to identify and address data-related issues.
- Write unit/integration tests, contribute to the engineering wiki, and document work.
- Bachelor's or Master's degree in Computer Science, Software Engineering, Computer Engineering, ICT, IT, or a related technical field.
- 2-5 years of experience as a data engineer or in a similar role.
- Experience with schema design and dimensional data modeling.
- Experience and knowledge in Python development.
- Advanced working knowledge of SQL, including query authoring, and experience working with relational databases as well as familiarity with a variety of other databases.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Experience building and optimizing data pipelines, architectures and data sets.
- Experience designing, building, and maintaining data processing systems.
- Experience with orchestration tools for batch and real-time data processing.
- Experience with CI/CD pipelines for data.
- Experience with big data.
- Familiarity with data integration and ETL tools.
- Strong problem-solving and analytical skills.
- Able to speak Thai fluently, with a basic command of English.
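A minimal sketch of the kind of data validation and cleansing step mentioned above, assuming a tabular dataset loaded with pandas; the column names (order_id, amount, order_date) are hypothetical examples, not part of the posting.

```python
# Hedged sketch of a validation/cleansing step; column names are placeholders.
import pandas as pd

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    # Drop exact duplicates that come from combining multiple source extracts.
    df = df.drop_duplicates()
    # Enforce types; invalid values become NaN/NaT instead of failing the load.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    # Basic consistency rules: keep rows with a key, a valid date, and a non-negative amount.
    valid = df["order_id"].notna() & df["order_date"].notna() & (df["amount"] >= 0)
    return df[valid].reset_index(drop=True)
```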
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large volumes of data from a variety of sources using cloud-based technologies, Python, and PySpark (see the sketch after this list).
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
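As a rough illustration of the pipeline work described above, here is a minimal PySpark batch job; the paths and column names are placeholders, and in practice the sources and sinks would be cloud object storage (S3, ADLS, or GCS) rather than local directories.

```python
# Hedged ingest-transform-store sketch; paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

# Ingest: read raw JSON events from a landing zone.
raw = spark.read.json("/landing/events/")

# Transform: basic deduplication, typing, and filtering.
events = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Store: write partitioned Parquet into the curated zone of the data lake.
events.write.mode("overwrite").partitionBy("event_date").parquet("/curated/events/")
```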
Skills:
Cloud Computing, SAP, Linux
Job type:
Full-time
Salary:
negotiable
- Act as the key Cloud technical resource for the consulting team, providing technical consulting to both internal and external customers.
- Design Cloud solution architectures in response to the client's requirements.
- Provide advisory consulting services to clients in line with True IDC Consulting practices.
- Create Cloud technical requirements for the client's migration plan.
- Experience of designing and implementing comprehensive Cloud computing solutions on various Cloud technologies e.g. AWS, GCP.
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Experience with SAP Cloud Infrastructure, in terms of architecture and design, on AWS and GCP public clouds.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and how their architectures map to the Cloud.
- Knowledge of container administration for both Windows and Linux technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Good at handling customer objections and strong customer presentation skills.
- Nice to have.
- UNIX shell scripting.
- AWS Certified Solutions Architect - Associate.
- GCP Certified Solution Architect - Associate.
Skills:
DevOps, Automation, Kubernetes
Job type:
Full-time
Salary:
negotiable
- Manage 7-8 Professional Services Engineers responsible for AWS cloud solution architecting and implementation/migration according to project requirements.
- Manage team resources.
- Act as the key AWS Cloud technical resource for the consulting team, providing AWS cloud consulting to customers.
- Design AWS Cloud solution architectures in response to the client's requirements.
- Define the scope of work and estimate man-days for cloud implementation.
- Manage cloud project delivery to meet the customer's required timeline.
- Support AWS and GCP cloud partner competency building, e.g. AWS certification, and the professional services delivery process and documentation.
- Act as the AWS technical speaker for True IDC webinars and online events such as CloudTalk.
- Drive team competency expansion to meet the yearly competency roadmap strategy, e.g. DevOps, IaC, Automation, Kubernetes, and application modernization on AWS cloud.
- Experience leading an AWS cloud implementation and delivery team.
- Experience designing and implementing comprehensive Cloud computing solutions on various Cloud technologies for AWS; GCP is a plus.
- Experience with infrastructure as code, either cloud-native (CloudFormation) or other tools such as Terraform or Ansible (see the sketch after this list).
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and how their architectures map to the Cloud.
- Knowledge of OS administration for both Windows and UNIX technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Knowledge of Kubernetes, Containers and CI/CD, DevOps.
- Experience with RDBMS designing and implementing over the Cloud.
- Prior experience with application development on various development stacks such as Java, .NET, Python, etc.
- Experience with .NET and/or the Spring Framework and RESTful web services.
- UNIX shell scripting.
- AWS Certified Solutions Architect - Associate; Professional level preferred.
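As a hedged illustration of the infrastructure-as-code point above, the snippet below provisions a CloudFormation stack from Python with boto3. The template, stack name, bucket name, and region are made-up examples; in practice templates would live in version control and be applied through a CI/CD pipeline (or written in Terraform/Ansible instead).

```python
# Illustrative only: programmatic CloudFormation deployment via boto3.
import json
import boto3

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DataBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-data-bucket-123456"},  # placeholder name
        }
    },
}

cfn = boto3.client("cloudformation", region_name="ap-southeast-1")
cfn.create_stack(
    StackName="example-data-stack",
    TemplateBody=json.dumps(template),
)
# Block until the stack finishes creating before handing it to the delivery team.
cfn.get_waiter("stack_create_complete").wait(StackName="example-data-stack")
```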
Experience:
2 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- You will be involved in all aspects of the project life cycle, including strategy, road-mapping, architecture, implementation and development.
- You will work with business and technical stakeholders to gather and analyse business requirements and convert them into technical requirements, specifications, and mapping documents.
- You will collaborate with technical teams, making sure the newly implemented solutions/technology are meeting business requirements.
- Outputs include workshop sessions and documentation including mapping documents.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Skills and attributes for success.
- 2-4 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to produce client-ready solutions and business-understandable presentations, plus good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages and frameworks such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka (see the streaming sketch after this list).
- Data modelling and architecting skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP or data vault.
- Good knowledge of DevOps engineering using Continuous Integration/Delivery tools.
- An in-depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experience integrating them into traditional hosting/delivery models.
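As one hedged example of the streaming skills listed above, here is a minimal Spark Structured Streaming job that reads from Kafka and lands Parquet files; the broker address, topic, and paths are assumptions, and the spark-sql-kafka connector package must be available on the classpath.

```python
# Hedged sketch: Kafka -> Parquet with Spark Structured Streaming (placeholder values).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "clickstream")                 # placeholder topic
         .load()
)

# Kafka delivers key/value as binary; cast the value to string for downstream parsing.
messages = stream.select(F.col("value").cast("string").alias("json_payload"))

query = (
    messages.writeStream.format("parquet")
            .option("path", "/curated/clickstream/")
            .option("checkpointLocation", "/checkpoints/clickstream/")
            .start()
)
query.awaitTermination()
```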
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Experience:
5 years required
Skills:
Data Analysis, Automation, Python
Job type:
Full-time
Salary:
negotiable
- Work with stakeholders throughout the organization to understand data needs and identify issues or opportunities for leveraging company data, proposing solutions that support decision-making and drive business outcomes.
- Adopt new technologies, techniques, and methods, such as machine learning or statistical techniques, to produce new solutions to problems.
- Conduct advanced data analysis and create appropriate algorithms to solve analytics problems.
- Improve the scalability, stability, accuracy, speed, and efficiency of existing data models.
- Collaborate with internal teams and partners to scale development up to production.
- Maintain and fine-tune existing analytic models to ensure model accuracy.
- Support the enhancement and accuracy of predictive automation capabilities based on valuable internal and external data and on established objectives for Machine Learning competencies (see the training sketch after this list).
- Apply algorithms to generate accurate predictions and resolve dataset issues as they arise.
- Act as project manager for data projects, managing project scope, timeline, and budget.
- Manage relationships with stakeholders and coordinate work between different parties, providing regular updates.
- Control, manage, and govern Level 2 support; identify and fix configuration-related problems.
- Keep data models and training models maintained and up to date.
- Run through data flow diagrams for model development.
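As a minimal, hedged sketch of the model-training work described above (not the team's actual pipeline), the example below trains and evaluates a simple classifier on synthetic placeholder data with scikit-learn.

```python
# Hedged sketch: train and evaluate a predictive model on placeholder data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))            # feature matrix (synthetic stand-in)
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # binary target (synthetic rule)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Hold-out accuracy is one simple way to monitor model accuracy over time.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```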
- EDUCATION.
- Bachelor's degree or higher in computer science, statistics, operations research, or a related technical discipline.
- EXPERIENCE.
- At least 5 years of experience in a statistical and/or data science role.
- Expertise in advanced analytical techniques such as descriptive statistical modelling and algorithms, machine learning algorithms, optimization, data visualization, pattern recognition, cluster analysis and segmentation analysis.
- Experience using analytical tools and languages such as Python, R, SAS, Java, C, C++, C#, Matlab, IBM SPSS, Tableau, QlikView, RapidMiner, Apache Pig, Spotfire, Micro S, SAP HANA, Oracle, or SQL-like languages.
- Experience working with large data sets, simulation/optimization and distributed computing tools (e.g., Map/Reduce, Hadoop, Hive, Spark).
- Experience developing and deploying machine learning model in production environment.
- Knowledge of oil and gas business processes is preferable.
- OTHER REQUIREMENTS.
Experience:
6 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- Collate technical and functional requirements through workshops with senior stakeholders in risk, actuarial, pricing and product teams.
- Translate business requirements to technical solutions leveraging strong business acumen.
- Analyse current business practice, processes, and procedures as well as identifying future business opportunities for leveraging Data & Analytics solutions on various platforms.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Provide architectural expertise to sales, project and other analytics teams.
- Identify risks, assumptions, and develop pricing estimates for the Data & Analytics solutions.
- Provide solution oversight to delivery architects and teams.
- Skills and attributes for success.
- 6-8 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to estimate complexity, effort and cost.
- Ability to produce client-ready solution architectures and business-understandable presentations, plus good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka, necessary to build and maintain complex queries and streaming and real-time data pipelines.
- Data modelling and architecting skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP or data vault.
- Good fundamentals around security integration, including Kerberos authentication and SAML, and data security and privacy, such as data masking and tokenisation techniques (see the sketch after this list).
- Good knowledge of DevOps engineering using Continuous Integration/Delivery tools.
- An in-depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experience integrating them into traditional hosting/delivery models.
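To make the data masking and tokenisation point above concrete, here is a small hedged sketch; the column names and the HMAC-based deterministic tokenisation scheme are assumptions for illustration, not a specific product's approach.

```python
# Hedged sketch of simple PII masking and tokenisation; scheme and fields are assumptions.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # in practice, fetched from a secrets manager

def tokenize(value: str) -> str:
    """Deterministic token: same input -> same token, so joins across tables still work."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask_email(email: str) -> str:
    """Keep the domain, hide the local part (e.g. 'a***@example.com')."""
    local, _, domain = email.partition("@")
    return (local[:1] + "***@" + domain) if domain else "***"

record = {"customer_id": "C-1001", "email": "alice@example.com"}  # placeholder record
safe_record = {
    "customer_token": tokenize(record["customer_id"]),
    "email_masked": mask_email(record["email"]),
}
print(safe_record)
```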
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Skills:
Compliance
Job type:
Full-time
Salary:
negotiable
- Ensure data availability, data integrity, and quality.
- Conduct regular system audits and generate reports on system performance and usage.
- Data Analysis and Reporting.
- Collect, analyze, and interpret data to provide actionable insights for business strategy.
- Develop and maintain dashboards, reports, and visualizations for various stakeholders.
- Support data-driven decision-making processes across the organization.
- Technical Support and Troubleshooting.
- Provide technical support for databases, visualization tools, and other systems in collaboration with related teams, promptly resolving issues.
- Project Management.
- Work with cross-functional teams to gather requirements and develop project plans.
- Monitor project progress and adjust plans as necessary to meet objectives.
- System Development and Integration.
- Identify opportunities for system improvements and innovations.
- Design and implement system enhancements and integrations with other business applications.
- Ensure compliance with industry standards and regulatory requirements.
- Bachelor's or Master's Degree in MIS, IT, computer science, statistics, mathematics, business, or related field.
- Minimum of 5 years' experience in BI, dashboard, and data analysis roles.
- Experienced in the data analytics lifecycle, including problem identification, measurement/metrics, exploratory data analysis, and data insight presentation (see the aggregation sketch after this list).
- Data visualization with Microsoft Power BI or Tableau; Apache Superset is a plus.
- Strong creative and analytical problem-solving capabilities.
- Communication skills.
- Knowledge of database concepts and management.
- Excellent with MS Excel and SQL; Python, Airflow, and PySpark are a plus.
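A tiny hedged example of the kind of aggregation that typically feeds a dashboard or report, using pandas; the sales figures and column names are made up, and in practice the data would come from SQL or a data lake rather than an in-memory frame.

```python
# Hedged sketch: monthly revenue by region, pivoted into a dashboard-friendly table.
import pandas as pd

sales = pd.DataFrame(
    {
        "month": ["2024-01", "2024-01", "2024-02", "2024-02"],
        "region": ["North", "South", "North", "South"],
        "revenue": [120_000, 95_000, 134_000, 101_000],
    }
)

summary = sales.pivot_table(index="month", columns="region", values="revenue", aggfunc="sum")
summary["total"] = summary.sum(axis=1)
print(summary)
```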
Skills:
Automation, Power BI, Tableau, English
Job type:
Full-time
Salary:
negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements and develop data visualizations using Power BI and Tableau, working across workstreams to support data requirements including reports and dashboards.
- Analyze and perform data profiling to understand data patterns, following Data Quality and Data Management processes (see the profiling sketch after this list).
- Build proofs of concept and test ETL tool solutions for customer relationship management.
- Develop and maintain a customer profile data service using the Grails framework, Apache Hadoop, shell scripts, and impala-shell.
- Establish requirements and coordinate production with programmers to control the solution.
- Define application problems by conferring with users and analyzing procedures and processes.
- Write documentation such as technical specifications, troubleshooting guides, and application logs to serve as a reference.
- 3+ years of experience in big data technology, data engineering, data science, or data analytics application system development.
- Experience with unstructured data for business intelligence or computer science would be an advantage.
- Java, Groovy, JavaScript, Perl, Shell Script.
- Grails Framework, Catalyst Framework, Node.js.
- MySQL, MongoDB, MariaDB, Apache Hadoop, Impala.
- Documentation, testing, and maintenance.
- IntelliJ IDEA, Visual Studio Code, Postman, RoboMongo, MobaXterm, WinSCP.
- English communication.
- Fast learner, creative, and a team player.
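As a small hedged illustration of the data profiling mentioned above, the sketch below summarises column types, null rates, and distinct counts with pandas; the source file and columns are placeholders.

```python
# Hedged data-profiling sketch; the input path is a placeholder.
import pandas as pd

df = pd.read_csv("customer_profile.csv")  # placeholder source file

profile = pd.DataFrame(
    {
        "dtype": df.dtypes.astype(str),
        "non_null": df.notna().sum(),
        "null_pct": (df.isna().mean() * 100).round(2),
        "distinct": df.nunique(),
    }
)
print(profile)

# Numeric distributions help spot outliers before building reports on top of the data.
print(df.select_dtypes(include="number").describe())
```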
Experience:
5 years required
Skills:
Java, Spring Boot, Thai, English
Job type:
Full-time
Salary:
negotiable
- Responsible for the detailed solution design of the required solution, aligning with the Technology Solution Lead, Senior BA/BA, and all partner teams, and managing the delivery of the detailed technical specification by the respective team resources, ensuring that the quality of deliverables meets the expected requirements.
- Coordinate with BU/SU to gather requirements and design application architecture aligned with the IT Blueprint as well as business needs and directions.
- Manage resources to provide the related services, including adaptation of applications ...
- Application Solution Delivery.
- Lead the delivery of application solutions with well-designed architecture aligned with the standards in the IT Blueprint.
- Manage resources to apply the proper technology in developing value-added solutions to serve business needs.
- Manage resources to deliver automated and fully integrated solutions for end-to-end work processes.
- Bachelor's degree or higher in Information Technology, Computer Science, or other related fields.
- At least 5 years of experience in core Java fundamentals, Java 8+, Spring, Spring Boot, and testing frameworks like JMeter.
- Experience in Application Architecture and System Integration using technologies such as Unix, Linux, Apache, JBoss, SQL databases, MQ, Redis.
- Hands-on experience with Next.js, React, Java Spring Boot, Bootstrap, and Tailwind.
- Familiarity with message queues (MQ) and Redis, along with experience using automation tools, Git control, and support tools.
- CI/CD implementation experience from scratch, using tools like GitHub, GitLab, Bitbucket, and Jenkins.
- Experience with JBoss, OpenShift, Docker, and Firebase messaging services.
- Experience in network and security, including resolving firewall connection issues, addressing integration challenges, load balancing, and disaster recovery planning.
- Experience developing Unix Shell Scripting, SQL, Java, and Python from scratch.
- Experience in application and database design.
- Experience in Production Support Management, including Incident and Problem Management.
- Knowledge of banking products or the banking and financial industry would be advantageous.
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training (see the sketch after this list).
- Conduct meetings with users to understand the data requirements and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years of experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience with query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
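A minimal hedged sketch of preparing raw records for LLM training, as mentioned above: it converts a CSV of articles into the common JSONL text format. The source file and field names (title, body) are assumptions for illustration.

```python
# Hedged sketch: raw CSV records -> JSONL suitable as LLM training text.
import csv
import json

with open("raw_articles.csv", newline="", encoding="utf-8") as src, \
     open("train.jsonl", "w", encoding="utf-8") as out:
    for row in csv.DictReader(src):
        text = (row.get("title", "") + "\n" + row.get("body", "")).strip()
        if not text:
            continue  # skip empty records rather than emitting blank training rows
        out.write(json.dumps({"text": text}, ensure_ascii=False) + "\n")
```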
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.