Skills:
Sales, Hadoop, ETL, English
Job type:
Full-time
Salary:
negotiable
- Bachelor's degree or equivalent practical experience.
- 10 years of experience in software sales or account management.
- Experience promoting analytics, data warehousing, or data management software.
- Ability to communicate fluently in English and Thai in order to support APAC customers.
- Preferred qualifications:
- Experience with business intelligence front-end, data analytics middleware, or back-end data warehouse technologies.
- Experience working with sales engineers and customer technical leads to build business cases for transformation and accompanying plans for implementation.
- Understanding of data analytics technology stack (e.g., Hadoop/Spark, Columnar data warehouses, data streaming, ETL and data governance, predictive analytics, data science framework, etc.).
- Understanding of Google Cloud Data and Analytics offerings (e.g., BigQuery, Looker, Dataproc, Pub/Sub, etc.).
- Ability to engage and influence executive stakeholders as a business advisor and thought leader in data and analytics.
- Excellent business acumen and problem-solving skills.
- As a member of the Google Cloud team, you inspire leading companies, schools, and government agencies to work smarter with Google tools like Google Workspace, Search, and Chrome. You advocate for the innovative power of our products to make organizations more productive, collaborative, and mobile. Your guiding light is doing what's right for the customer; you will meet customers exactly where they are and provide them with the best solutions for innovation. Using your passion for Google products, you help spread the magic of Google to organizations around the world.
- In this role, you will build an understanding of our customers' businesses and bring expertise to executive-level relationships to help them deliver their strategies. You will leverage your expertise in promoting data analytics and work with account teams, customer engineering, and partners to ensure customer outcomes.
- Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
- Responsibilities:
- Calibrate business against the objectives and key results, forecast and report the state of the business for the assigned territory.
- Build and maintain executive relationships with customers as the data analytics subject matter expert, influencing long-term strategic direction.
- Develop and execute strategic account plans, including a broader enterprise plan across key industries. Focus on building accounts.
- Assist customers in identifying use cases suitable for Google Cloud Data and Analytics solutions, articulating key solution differentiation and measurable business impacts.
- Work with Google account and technical teams to develop and drive pipelines, and provide expertise. Develop Go-To-Market (GTM) efforts with Google Cloud Platform partners.
- Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Experience:
2 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- You will be involved in all aspects of the project life cycle, including strategy, road-mapping, architecture, implementation and development.
- You will work with business and technical stakeholders to gather and analyse business requirements and convert them into technical requirements, specifications, and mapping documents.
- You will collaborate with technical teams, making sure the newly implemented solutions/technology are meeting business requirements.
- Outputs include workshop sessions and documentation including mapping documents.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Skills and attributes for success.
- 2-4 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to produce client-ready solutions and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages and tools such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka (illustrated in the brief sketch after this requirements list).
- Data modelling and architecting skills including strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP, or data vault.
- Good knowledge in DevOps engineering using Continuous Integration/ Delivery tools.
- An in depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experienced in integrating into traditional hosting/delivery models.
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
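The requirements above mention hands-on data manipulation with Spark and Hive SQL. As a rough, hypothetical illustration only (not EY's or any client's actual code), the PySpark sketch below joins and aggregates two source tables using both the DataFrame API and Spark SQL; the table, column, and path names are invented for the example.

```python
# Minimal, hypothetical PySpark sketch: join, filter, and aggregate two source
# tables with the DataFrame API, then express the same logic in Spark SQL.
# All table, column, and path names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-aggregation").getOrCreate()

# Read two hypothetical source tables landed as Parquet by an ingestion job.
policies = spark.read.parquet("/data/raw/policies")
claims = spark.read.parquet("/data/raw/claims")

# DataFrame API: join, filter, and aggregate.
claims_by_product = (
    claims.join(policies, on="policy_id", how="inner")
          .filter(F.col("claim_status") == "APPROVED")
          .groupBy("product_line")
          .agg(F.count("*").alias("claim_count"),
               F.sum("claim_amount").alias("total_paid"))
)

# Equivalent logic expressed in Spark SQL against temporary views.
claims.createOrReplaceTempView("claims")
policies.createOrReplaceTempView("policies")
claims_by_product_sql = spark.sql("""
    SELECT p.product_line,
           COUNT(*)            AS claim_count,
           SUM(c.claim_amount) AS total_paid
    FROM claims c
    JOIN policies p ON c.policy_id = p.policy_id
    WHERE c.claim_status = 'APPROVED'
    GROUP BY p.product_line
""")

claims_by_product.write.mode("overwrite").parquet("/data/curated/claims_by_product")
```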
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY.
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Skills:
Big Data, Research, Statistics
Job type:
Full-time
Salary:
negotiable
- Design, code, experiment and implement models and algorithms to maximize customer experience, supply side value, business outcomes, and infrastructure readiness.
- Mine big data covering hundreds of millions of customers, more than 600M daily user-generated events, and supplier and pricing data, and discover actionable insights to drive improvements and innovation.
- Work with developers and a variety of business owners to deliver daily results with the best quality.
- Research, discover, and harness new ideas that can make a difference.
- What You'll Need to Succeed.
- 4+ years hands-on data science experience.
- Excellent understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open source libraries and frameworks (a brief illustrative sketch follows the requirements below).
- Significant proficiency in SQL and languages like Python, PySpark and/or Scala.
- Can lead, work independently as well as play a key role in a team.
- Good communication and interpersonal skills for working in a multicultural work environment.
- It's Great if You Have.
- PhD or MSc in Computer Science / Operations Research / Statistics or other quantitative fields.
- Experience in NLP, image processing and/or recommendation systems.
- Hands on experience in data engineering, working with big data framework like Spark/Hadoop.
- Experience in data science for e-commerce and/or OTA.
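The requirements above call for model experimentation using open-source libraries. The sketch below is a minimal, hypothetical example of that kind of loop using scikit-learn; the input file, feature names, and target are illustrative assumptions, not Agoda's actual data or code.

```python
# Illustrative only: a minimal model-experimentation loop with scikit-learn.
# The sample file, features, and target column are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

events = pd.read_parquet("user_events_sample.parquet")  # hypothetical sample extract
features = events[["num_searches", "num_clicks", "avg_price_viewed", "days_since_signup"]]
target = events["booked"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42, stratify=target
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate on a held-out set before considering any online experiment.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out AUC: {auc:.3f}")
```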
- We welcome both local and international applications for this role. Full visa sponsorship and relocation assistance available for eligible candidates.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Experience:
5 years required
Skills:
Data Analysis, Automation, Python
Job type:
Full-time
Salary:
negotiable
- Work with stakeholders throughout the organization to understand data needs, identify issues or opportunities for leveraging company data to propose solutions for support decision making to drive business solutions.
- Adopt new technologies and methods, such as machine learning or statistical techniques, to produce new solutions to problems.
- Conduct advanced data analysis and create appropriate algorithms to solve analytics problems.
- Improve the scalability, stability, accuracy, speed, and efficiency of existing data models.
- Collaborate with internal teams and partners to scale up development to production.
- Maintain and fine-tune existing analytic models to ensure model accuracy.
- Support the enhancement and accuracy of predictive automation capabilities based on valuable internal and external data and on established objectives for Machine Learning competencies.
- Apply algorithms to generate accurate predictions and resolve dataset issues as they arise.
- Act as project manager for data projects, managing project scope, timeline, and budget.
- Manage relationships with stakeholders and coordinate work between different parties, as well as providing regular updates.
- Control, manage, and govern Level 2 support; identify and fix configuration-related problems.
- Keep data modelling and model training up to date.
- Run through data flow diagrams for model development.
- EDUCATION.
- Bachelor's degree or higher in computer science, statistics, or operations research or related technical discipline.
- EXPERIENCE.
- At least 5 years of experience in a statistical and/or data science role.
- Expertise in advanced analytical techniques such as descriptive statistical modelling and algorithms, machine learning algorithms, optimization, data visualization, pattern recognition, cluster analysis and segmentation analysis.
- Experience using analytical tools and languages such as Python, R, SAS, Java, C, C++, C#, MATLAB, IBM SPSS, Tableau, QlikView, RapidMiner, Apache Pig, Spotfire, Micro S, SAP HANA, Oracle, or SQL-like languages.
- Experience working with large data sets, simulation/optimization and distributed computing tools (e.g., Map/Reduce, Hadoop, Hive, Spark).
- Experience developing and deploying machine learning models in production environments (see the brief sketch after this experience list).
- Knowledge in oil and gas business processes is preferable.
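The experience list above mentions developing and deploying machine learning models in a production environment. The following is a minimal, hypothetical sketch of one common pattern: training a scikit-learn pipeline and serializing it with joblib so a serving process can load the same artifact. The file names, features, and model choice are illustrative assumptions, not this employer's actual setup.

```python
# Illustrative only: train a preprocessing + model pipeline and serialize it
# as a deployable artifact. File, column, and model names are hypothetical.
import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("sensor_readings.csv")  # hypothetical training extract
X, y = df[["pressure", "temperature", "flow_rate"]], df["anomaly_flag"]

# Keep preprocessing and the estimator in one pipeline so the serialized
# artifact is self-contained when loaded in the serving environment.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X, y)

joblib.dump(pipeline, "anomaly_model_v1.joblib")

# In the serving process, the artifact is loaded once and reused per request.
serving_model = joblib.load("anomaly_model_v1.joblib")
print(serving_model.predict(X.head(3)))
```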
- OTHER REQUIREMENTS.
Experience:
8 years required
Skills:
Automation, Finance, Compliance
Job type:
Full-time
Salary:
negotiable
- Work closely with stakeholders from different verticals in Finance.
- Consult stakeholders to propose the best suited automation solution.
- Build/manage/optimize E2E automations while ensuring security and compliance aspects.
- Become familiar with the finance ecosystem at Agoda.
- Work with Gen AI technology to deliver efficiency-saving automation projects.
- Undergraduate/postgraduate degree.
- 8+ years of experience with RPA tools, process mining tools, and OCR (ideally 10+ years, but not mandatory). Mandatory: Power Automate, Celonis, Rossum. Good to have: Blue Prism, Alteryx, Blue Prism Process Intelligence (BPPI), Interact.
- End-to-end delivery of at least 5 processes using Power Automate Desktop.
- Experience in full stack development with framework experience on- Django, React JS, Node JS and Bootstrap.
- Experience in advanced programming and Gen AI implementation using JavaScript and Python, with expertise in vector embedding databases (see the brief sketch after this requirements list).
- Strong understanding of ITGC controls and payment wallet systems. Oversee the development and deployment of RPA solutions for digital wallets.
- High sense of ownership and growth mindset, ability to be self-directed.
- Establish robust RPA governance and strong control over bots ensuring high Bot utilization factor.
- Excellent communication skills and ability to influence peers and build strong relationships within Finance and cross-functionally.
- Advanced Excel skills.
- Accounting/Financial knowledge and commercial acumen.
- Experience in full stack development - UI/UX / API dev /backend/Postgres DB.
- Solid technical/functional knowledge of the Procure-to-Pay (P2P), Order-to-Cash (O2C), and Record-to-Report (R2R) cycles.
- Familiarity with scrum/agile methodology.
- Other helpful skills - Hadoop, Celonis Certification, Power Automate Desktop Certification.
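One item above mentions Gen AI work with vector embedding databases. As a rough, library-agnostic illustration of the core retrieval idea (not any specific vector database's API), the sketch below ranks stored embeddings against a query vector by cosine similarity using NumPy; the vectors here are random placeholders.

```python
# Illustrative only: the nearest-neighbour lookup at the heart of a
# vector-embedding store, shown with plain NumPy so no specific database
# product is assumed. Embeddings are random placeholders.
import numpy as np

def cosine_top_k(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k stored vectors most similar to the query."""
    query_norm = query_vec / np.linalg.norm(query_vec)
    doc_norms = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = doc_norms @ query_norm
    return np.argsort(scores)[::-1][:k]

# Hypothetical 768-dimensional embeddings, e.g. produced by an embedding
# model upstream; generated randomly here purely for demonstration.
rng = np.random.default_rng(0)
stored = rng.normal(size=(1000, 768))
query = rng.normal(size=768)

print(cosine_top_k(query, stored, k=3))
```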
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Experience:
6 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- Collate technical and functional requirements through workshops with senior stakeholders in risk, actuarial, pricing and product teams.
- Translate business requirements to technical solutions leveraging strong business acumen.
- Analyse current business practices, processes, and procedures, as well as identify future business opportunities for leveraging Data & Analytics solutions on various platforms.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Provide architectural expertise to sales, project and other analytics teams.
- Identify risks, assumptions, and develop pricing estimates for the Data & Analytics solutions.
- Provide solution oversight to delivery architects and teams.
- Skills and attributes for success.
- 6-8 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to estimate complexity, effort and cost.
- Ability to produce client-ready solution architecture and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages and tools such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka, necessary to build and maintain complex queries and streaming and real-time data pipelines (illustrated in the brief streaming sketch after this requirements list).
- Data modelling and architecting skills including strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP, or data vault.
- Good fundamentals around security integration including Kerberos authentication, SAML and data security and privacy such as data masking and tokenisation techniques.
- Good knowledge in DevOps engineering using Continuous Integration/ Delivery tools.
- An in depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experienced in integrating into traditional hosting/delivery models.
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
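The requirements above mention building streaming and real-time data pipelines with Spark and Kafka. The sketch below is a minimal, hypothetical Spark Structured Streaming job that reads a Kafka topic and writes parsed events to Parquet; the topic name, schema, broker address, and paths are illustrative assumptions (and running it requires the Spark-Kafka connector package).

```python
# Minimal, hypothetical near real-time pipeline: read a Kafka topic with
# Spark Structured Streaming, parse JSON events, and append them as Parquet.
# Topic, schema, broker, and paths are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "policy-events")
       .load())

parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", event_schema).alias("e"))
             .select("e.*"))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/streaming/policy_events")
         .option("checkpointLocation", "/data/checkpoints/policy_events")
         .outputMode("append")
         .start())

query.awaitTermination()
```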
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY.
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Skills:
Automation, Power BI, Tableau, English
Job type:
Full-time
Salary:
negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements, and develop data visualizations using Power BI and Tableau, working across workstreams to support those requirements.
- Analyze and perform data profiling to understand data patterns, following Data Quality and Data Management processes (see the brief profiling sketch at the end of this listing).
- Build proofs of concept and test ETL tool solutions for customer relationship management.
- Develop and maintain customer profile data service using Grails framework, Apache Hadoop, Shell script, and Impala-shell.
- Establish requirements and coordinate production with programmers to control the solution.
- Define application problems by conferring with users and analyzing procedures and processes.
- Write documentation such as technical specifications, troubleshooting guides, and application logs to serve as references.
- 3+ years of experience in big data technology, data engineering, data science, or data analytics application system development.
- Experience with unstructured data for business intelligence or computer science would be an advantage.
- Java, Groovy, JavaScript, Perl, Shell Script.
- Grails Framework, Catalyst Framework, Nodejs.
- MySQL, MongoDB, MariaDB, Apache Hadoop, Impala.
- Documentation, testing, and maintenance.
- IntelliJ IDEA, Visual Studio Code, Postman, RoboMongo, MobaXterm, WinSCP.
- English communication.
- Fast learner, creative, and a team player.
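The responsibilities above include data profiling to understand data patterns. As a small, hypothetical illustration only, the pandas sketch below computes a first-pass profile (types, null rates, distinct counts) and spot-checks value distributions; the file and column names are invented for the example.

```python
# Illustrative only: a simple first-pass data-profiling step with pandas.
# The input file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("customer_profile_extract.csv")

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "non_null": df.notna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
print(profile)

# Spot-check value distributions for a few key fields before loading downstream.
for column in ["customer_segment", "signup_channel"]:
    if column in df.columns:
        print(df[column].value_counts(dropna=False).head(10))
```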
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training (see the brief sketch after this listing's qualifications).
- Conduct meetings with users to understand data requirements, and perform database design based on that understanding with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years of experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query languages such as SQL, Databricks SQL, and PostgreSQL.
- Experience in big data technologies like Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
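The responsibilities above include building pipelines that turn raw data into formats suitable for LLM training. The sketch below is a deliberately small, hypothetical transform step: normalising whitespace, dropping very short fragments, and removing exact duplicates from a JSONL corpus. The file paths, field names, and thresholds are illustrative assumptions, not this employer's actual pipeline.

```python
# Illustrative only: a tiny transform step of the sort used to prepare raw
# text for LLM training - normalise whitespace and drop exact duplicates.
# Input/output paths, field names, and thresholds are hypothetical.
import hashlib
import json
import re

seen_hashes = set()

with open("raw_documents.jsonl", encoding="utf-8") as src, \
     open("clean_documents.jsonl", "w", encoding="utf-8") as dst:
    for line in src:
        record = json.loads(line)
        text = re.sub(r"\s+", " ", record.get("text", "")).strip()
        if len(text) < 50:          # drop fragments too short to be useful
            continue
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if digest in seen_hashes:   # exact-duplicate removal
            continue
        seen_hashes.add(digest)
        dst.write(json.dumps({"text": text}, ensure_ascii=False) + "\n")
```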
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Data Analysis, ETL, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Data Architecture: Design, develop, and maintain the overall data architecture and data pipeline systems to ensure efficient data flow and accessibility for analytical purposes.
- Data Integration: Integrate data from multiple sources, including point-of-sale systems, customer databases, e-commerce platforms, supply chain systems, and other relevant data sources, ensuring data quality and consistency.
- Data Modeling: Design and implement data models that are optimized for scalability, ...
- Data Transformation and ETL: Develop and maintain efficient Extract, Transform, and Load (ETL) processes to transform raw data into a structured format suitable for analysis and reporting (see the brief sketch after this listing's qualifications).
- Data Warehousing: Build and maintain data warehouses or data marts that enable efficient storage and retrieval of structured and unstructured data for reporting and analytics purposes.
- Data Quality and Monitoring: Implement data quality checks and monitoring mechanisms to identify and resolve data inconsistencies, anomalies, and issues in a timely manner.
- Performance Optimization: Optimize data processing and query performance to ensure efficient data retrieval and analysis, considering factors such as data volume, velocity, and variety.
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Strong programming skills in languages such as Python and SQL; C++ is a plus.
- At least 5 years of experience with data modeling, database design, and data warehousing concepts.
- Proficiency in working with relational databases (e.g., MySQL, PostgreSQL) and big data technologies (e.g., Hadoop, Spark, Hive).
- Familiarity with cloud-based data platforms, such as AWS.
- Knowledge of ETL tools and techniques for data integration and transformation.
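The qualifications above cover ETL development with Python and SQL. As a minimal, warehouse-agnostic illustration (using only the Python standard library rather than any specific ETL tool), the sketch below extracts rows from a CSV export, applies a simple transform, and loads the result into a SQLite table; all file, table, and column names are invented.

```python
# Illustrative only: the shape of a small extract-transform-load step using the
# standard library (csv, sqlite3) so no specific warehouse or ETL tool is assumed.
# File, table, and column names are hypothetical.
import csv
import sqlite3

conn = sqlite3.connect("analytics.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS daily_sales (
        store_id   TEXT,
        sale_date  TEXT,
        net_amount REAL
    )
""")

# Extract from a raw POS export, transform (type-cast, filter out refunds),
# and load into the reporting table.
with open("pos_export.csv", newline="", encoding="utf-8") as f:
    rows = []
    for rec in csv.DictReader(f):
        amount = float(rec["amount"])
        if amount <= 0:
            continue
        rows.append((rec["store_id"], rec["date"], amount))

conn.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", rows)
conn.commit()
conn.close()
```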
- Location: BTS Ekkamai
- Working Day: Mon-Fri (WFA Every Friday).