Skills:
Research, ETL, Automation
Job type:
Full-time
Salary:
Negotiable
- Lead the design and development of data architecture, ensuring scalability, security, and alignment with business strategy.
- Oversee the collection, transformation, and integration of data from multiple internal and external sources.
- Conduct advanced research and troubleshooting to address complex business and technical problems.
- Design, build, and optimize data pipelines and ETL processes to handle large-scale and real-time data.
- Implement automation solutions to minimize manual intervention and improve data efficiency.
- Provide technical leadership and mentorship to junior engineers, ensuring best practices in coding, testing, and deployment.
- Collaborate with cross-functional stakeholders including Data Scientists, Analysts, and Business Leaders to deliver actionable insights.
- Evaluate and recommend new tools, frameworks, and technologies to enhance data engineering capabilities.
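The pipeline and ETL responsibilities above can be illustrated with a minimal batch job. This is only a sketch using SQLite and illustrative table/column names (`orders_clean` and friends are assumptions, not from the posting):

```python
# Minimal batch ETL sketch: extract from CSV, transform, load into SQLite.
# All names (RAW, orders_clean) are illustrative.
import csv
import io
import sqlite3

RAW = """order_id,amount,currency
1,100.50,THB
2,,THB
3,75.00,USD
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts, cast types."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"])
        for r in rows
        if r["amount"]  # skip incomplete records
    ]

def load(records, conn):
    """Load: idempotent upsert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean "
        "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders_clean VALUES (?, ?, ?)", records
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM orders_clean").fetchone()[0])  # → 2
```

Real pipelines would swap SQLite for a warehouse and add scheduling, but the extract/transform/load separation is the same shape the posting describes.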
- Job Specification: Bachelor's Degree in Information Technology, Computer Science, Statistics, Mathematics, Business, or a related field (Master's Degree is a plus).
- Minimum of 5 years' experience in data engineering, with at least 2 years in a senior or lead role.
- Proven expertise in the data analytics lifecycle, including business problem framing, KPI/metrics design, exploratory analysis, and presenting data insights.
- Strong hands-on experience with cloud platforms (AWS, GCP, Azure) and advanced programming skills in Python, Java, PySpark.
- Solid knowledge of data processing, ETL frameworks, data warehousing, and messaging queue systems (e.g., Kafka).
- Demonstrated experience in designing highly scalable, resilient, and secure data systems.
Job type:
Full-time
Salary:
Negotiable
- We are looking for a skilled Data Engineer to join our team and help build and maintain our data infrastructure. The ideal candidate will be responsible for designing, implementing, and managing our data processing systems and pipelines. You will work closely with data scientists, analysts, and other teams to ensure efficient and reliable data flow throughout the organization.
- Design, develop, and maintain scalable data pipelines for batch and real-time processing.
- Implement ETL processes to extract data from various sources and load it into data warehouses or data lakes.
- Optimize data storage and retrieval processes for improved performance.
- Collaborate with data scientists and analysts to understand their data requirements and provide appropriate solutions.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security measures and access controls.
- Troubleshoot data-related issues and optimize system performance.
- Stay up-to-date with emerging technologies and industry trends in data engineering.
- Document data architectures, pipelines, and processes.
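The data-quality duties above (quality, consistency, reliability) typically reduce to automated checks such as completeness, uniqueness, and validity. A minimal sketch with an illustrative record layout (the fields and thresholds are assumptions):

```python
# Sketch of automated data-quality checks: completeness, uniqueness, validity.
# The record layout below is illustrative, not from the posting.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None, "age": 28},
    {"id": 3, "email": "c@example.com", "age": -5},
]

def check_completeness(rows, column):
    """Share of rows with a non-null value in `column`."""
    return sum(r[column] is not None for r in rows) / len(rows)

def check_uniqueness(rows, column):
    """True if `column` contains no duplicate values."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_validity(rows, column, predicate):
    """Rows that fail a domain rule, e.g. age must be non-negative."""
    return [r for r in rows if not predicate(r[column])]

print(round(check_completeness(records, "email"), 2))             # → 0.67
print(check_uniqueness(records, "id"))                            # → True
print(len(check_validity(records, "age", lambda a: a >= 0)))      # → 1
```

In practice these checks would run on every pipeline execution and fail the job (or raise an alert) when a threshold is breached.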
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-4 years of experience in data engineering or similar roles.
- Strong programming skills in Python, Java, or Scala.
- Proficiency in SQL and experience with relational databases (e.g., Databricks, PostgreSQL, MySQL).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data warehousing concepts and ETL best practices.
- Experience with version control systems (e.g., Git).
- Understanding of data cleansing, data modeling, and database design principles.
- Solid problem-solving skills and attention to detail.
- Good communication skills and the ability to work with technical and non-technical team members.
- Preferred: experience with the Azure data platform (ADF, Databricks); familiarity with data visualization tools (e.g., Tableau, Power BI); knowledge of stream processing technologies and data sources (e.g., Kafka, APIs, Google BigQuery, MongoDB, SFTP); experience with containerization technologies (e.g., Docker).
- Experience handling large data volumes, with optimization skills in development.
- Understanding of machine learning concepts and data science workflows.
Skills:
Compliance, Python, SQL
Job type:
Full-time
Salary:
Negotiable
- The Lead System Analyst/Senior Data Engineer is assigned to the IECC Project and Finance-Risk and Compliance Data initiatives, supporting solution design and data integration between upstream applications, downstream applications, and business users.
- To design, build, and operate reliable data pipelines across batch, near-real-time, and real-time workloads.
- To utilize multiple technologies (e.g. Python, SQL/Stored Procedures, ETL/ELT tools) to ingest, transform, and deliver governed, audit-ready data.
- To orchestrate and monitor jobs, implement data quality controls, and ensure security, lineage, and observability, while modernizing existing workflows with automation, testing, and performance tuning.
- Build and maintain ingestion, transformation, and delivery pipelines that produce governed, audit-ready datasets.
- Use Python, SQL/Stored Procedures, and ETL/ELT frameworks (or any relevant technologies) to implement scalable and reusable data pipeline components.
- Orchestrate and monitor workloads (e.g., DAGs/schedulers), ensuring reliability, idempotency, and rerunnability.
- Enforce data quality (completeness, validity, accuracy, timeliness, uniqueness) and reconciliation checks.
- Ensure security and compliance: access control, PII handling, encryption, and audit logging.
- Design and manage workflow orchestration for reliable execution, monitoring, and failure recovery with Airflow/Control-M/ESP (DAGs, retries, backfills, idempotency).
- Collaborate with Architects/Stewards to apply a Shared Canonical Model (CDM) and data standards.
- Implement security controls (RBAC/ABAC), PII masking, encryption in-transit/at-rest, and auditable logs.
- Maintain runbooks, technical specifications (e.g. data mapping), and contribute to CI/CD (Git, artifacts, release notes).
- Monitor pipelines (SLIs/SLOs), diagnose incidents, and drive continuous performance and cost improvements.
- Promote data literacy and a data-driven culture through cross-functional collaboration.
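Several of the duties above hinge on loads being safe to re-execute (idempotency, rerunnability, backfills). One common pattern is delete-then-insert per business date, sketched below with SQLite; the table and column names are illustrative assumptions, not from the posting:

```python
# Sketch of an idempotent, rerunnable batch load: reprocessing a business
# date first deletes that date's partition, so reruns never duplicate rows.
# Table/column names (txn_daily, business_date) are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE txn_daily (business_date TEXT, account TEXT, amount REAL)"
)

def load_partition(conn, business_date, rows):
    """Delete-then-insert inside one transaction: safe to rerun or backfill."""
    with conn:  # commits on success, rolls back on error
        conn.execute(
            "DELETE FROM txn_daily WHERE business_date = ?", (business_date,)
        )
        conn.executemany(
            "INSERT INTO txn_daily VALUES (?, ?, ?)",
            [(business_date, acct, amt) for acct, amt in rows],
        )

batch = [("ACC-1", 120.0), ("ACC-2", 45.5)]
load_partition(conn, "2024-01-31", batch)
load_partition(conn, "2024-01-31", batch)  # rerun: still no duplicates
print(conn.execute("SELECT COUNT(*) FROM txn_daily").fetchone()[0])  # → 2
```

The same idea scales up: an Airflow/Control-M task keyed by execution date can be retried or backfilled without reconciliation drift.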
- Apply now if you have these advantages.
- Bachelor's/Master's degree in Computer Engineering, Computer Science, Information Technology, or related fields.
- 8-12 years as a System Analyst/Data Engineer, including 2-3 years in the banking industry.
- Strong background in one or more: large-scale data processing, data infrastructure engineering, or data modeling.
- Solid grasp of CDC patterns, schema-drift control, robust error handling, and recovery/replay.
- Proven track record improving pipelines via automation, testing, and performance tuning.
- Exposure to cloud data platforms (AWS/Azure/GCP), Databricks/Spark Structured Streaming is a plus.
- Proficient in Python and SQL (or any relevant programming languages) and be able to apply solid software engineering practices (testing, version control, code reviews).
- Strong SQL (complex queries, optimization) and Python (DB-API/pandas or PySpark); comfortable with the Unix shell.
- Experience with one or more: Talend, IBM DataStage, Airflow, Kafka, Spark, Trino/Presto.
- Curious, resilient, and critical thinker, open to feedback and continuous improvement.
- Financial services, risk and regulatory data experience (e.g., IECC, IFRS9, Basel, BOT, AML, Credit Risk, Compliance) is an advantage.
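As a sketch of the PII-masking controls this role calls for: deterministic salted hashing keeps identifiers joinable across audit-ready datasets while hiding raw values. The field names and salt handling below are illustrative assumptions, not the bank's actual scheme:

```python
# Sketch of deterministic PII masking: hash identifiers with a salt so
# joins and reconciliation still work, while raw values never leave the
# pipeline. Field names and salt handling are illustrative assumptions.
import hashlib

SALT = b"rotate-me-per-environment"  # in practice, fetched from a key vault

def mask_id(value):
    """Deterministic pseudonymisation: equal inputs map to equal tokens."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

def redact(record, pii_fields):
    """Return a copy of the record with PII fields replaced by tokens."""
    return {
        k: (mask_id(v) if k in pii_fields else v) for k, v in record.items()
    }

row = {"citizen_id": "1234567890123", "branch": "Bangkok", "balance": 1500.0}
masked = redact(row, {"citizen_id"})
print(masked["branch"])  # → Bangkok
```

Note that salted hashing alone is not full anonymisation; production designs pair it with access control (RBAC/ABAC), encryption, and audit logging as the posting lists.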
- Why join Krungsri?
- As part of MUFG (Mitsubishi UFJ Financial Group), we are a truly global bank with networks all over the world.
- We offer a striking work-life balance culture with hybrid work policies (minimum 2 days in office per week).
- Unbelievable benefits such as attractive bonuses, employee loans at special rates, and many more.
- Apply now before this role is closed.
- FB: Krungsri Career(http://bit.ly/FacebookKrungsriCareer [link removed]).
- LINE: Krungsri Career (http://bit.ly/LineKrungsriCareer [link removed]).
- Talent Acquisition Department
- Bank of Ayudhya Public Company Limited
- 1222 Rama III Rd., Bangpongpang, Yannawa, Bangkok 10120.
- Remark: The bank is required to verify personal information relating to the criminal history of applicants before they can be considered for employment with Krungsri Bank.
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the links below.
- EN (https://krungsri.com/b/privacynoticeen).
- TH (https://krungsri.com/b/privacynoticeth).
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
Negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and ensure data comprehensiveness for data consumers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, or near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good command of written and spoken English.
- Good interpersonal and communication skills.
Skills:
ETL, Automation, Data Warehousing
Job type:
Full-time
Salary:
Negotiable
- Design & Implement Data Platforms: Design, develop, and maintain robust, scalable data pipelines and ETL processes, with a focus on automation and operational excellence.
- Ensure Data Quality and Governance: Implement automated data validation, quality checks, and monitoring systems to ensure data accuracy, consistency, and reliability.
- Manage CI/CD for Data: Own and optimize the CI/CD pipelines for data engineering workflows, including automated testing and deployment of data transformations and schema ...
- Architect & Implement IaC: Use Infrastructure as Code (IaC) with Terraform to manage data infrastructure across various cloud platforms (Azure, AWS, GCP).
- Performance & Optimization: Proactively monitor and optimize query performance, data storage, and resource utilization to manage costs and enhance efficiency.
- Collaborate with Stakeholders: Manage communication with technical and business teams to understand requirements, assess technical and business impact, and deliver effective data solutions.
- Strategic Design: Possess the ability to see the big picture in architectural design, conduct thorough risk assessments, and plan for future scalability and growth.
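The automated-validation and CI/CD duties above often boil down to assertion-style checks that gate every commit. A minimal sketch, assuming a hypothetical country-code normalization transform (the function and mapping are illustrative, not from the posting):

```python
# Sketch of CI-style automated tests for a data transformation. The
# transform below (normalising free-form country values) is illustrative.
def normalize_country(code):
    """Map free-form country values to ISO-like codes (illustrative table)."""
    mapping = {"thailand": "TH", "th": "TH", "united states": "US", "us": "US"}
    key = code.strip().lower()
    if key not in mapping:
        raise ValueError(f"unknown country value: {code!r}")
    return mapping[key]

# Assertions like these run in the CI/CD pipeline; a failing case blocks
# deployment of the transformation.
assert normalize_country(" Thailand ") == "TH"
assert normalize_country("us") == "US"
try:
    normalize_country("Mars")
except ValueError:
    pass  # invalid values must be rejected, not silently passed through
else:
    raise AssertionError("invalid values must be rejected")
print("all transformation checks passed")
```

Tools like dbt tests or Great Expectations (both named in the posting) formalize the same idea declaratively instead of in hand-written asserts.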
- Experience: 1-3 years of experience in data engineering, data warehousing, and ETL processes, with a significant portion of that time focused on DataOps or a similar operational role.
- Platform Expertise: Strong experience with data platforms such as Databricks and exposure to multiple cloud environments (Azure, AWS, or GCP).
- Data Processing: Extensive experience with Apache Spark for large-scale data processing.
- Orchestration: Experience working with data orchestration tools like Azure Data Factory (ADF), Apache Airflow, or similar.
- CI/CD & Version Control: Knowledge of version control (Git) and experience with CI/CD pipelines (GitLab CI/CD, GitHub Actions).
- IaC: Hands-on experience with Terraform.
- Programming: Programming skills in Python and advanced proficiency in SQL.
- Soft Skills: Strong stakeholder management, communication, and collaboration skills. The ability to articulate complex technical concepts to non-technical audiences is a must.
- Problem-Solving: Strong problem-solving skills with an ability to analyze technical challenges and their business impact.
- Data Modeling: Experience with data modeling tools and methodologies, specifically with dbt (data build tool).
- AI & ML: Experience with AI-related technologies like Retrieval-Augmented Generation (RAG) and frameworks such as LangChain.
- Data Observability: Hands-on experience with data quality and observability tools such as Great Expectations, Monte Carlo, or Soda Core.
- Data Governance: Familiarity with data governance principles, compliance requirements, and data catalogs (e.g., Unity Catalog).
- Streaming Technologies: Experience with stream processing technologies like Kafka or Flink.
- Containerization: Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Open Source: Contributions to open-source projects or relevant certifications.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Big Data, SQL, Hadoop
Job type:
Full-time
Salary:
Negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and perform database design based on that understanding, with consideration for performance.
- Maintain the data dictionary, data relationships, and their interpretation.
- Analyze problem and find resolution, as well as work closely with administrators to monitor performance and advise any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience with query languages: SQL, Databricks SQL, PostgreSQL.
- Experience with big data technologies like Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Job type:
Full-time
Salary:
Negotiable
- The Senior Data Engineer position plays a key role in designing, developing, and managing cloud-based data platforms, as well as creating data structures for high-level data analysis, and works with business and technical teams to ensure that data management is appropriate and supports organizational goals.
- Responsible for the design, construction, and maintenance of optimal and scalable data pipeline architectures on cloud platforms (e.g., GCP, AWS, Azure).
- Oversee the development and management of complex ETL/ELT processes for data ingestion ...
- Author and optimize advanced, high-performance SQL queries for complex data transformation, aggregation, and analysis.
- Leverage the Python programming language for automation, scripting, and the development of data processing frameworks.
- Administer and optimize cloud-based data warehouse solutions and associated data lakes.
- Collaborate professionally with data scientists, analysts, and key business stakeholders to ascertain data requirements and deliver effective technical solutions.
- Provide mentorship to junior engineers and champion the adoption of data engineering best practices throughout the organization.
- Bachelor's degree or higher in Computer Science, Information Technology, Engineering, or a related field.
- At least 5 years of experience working in a data engineering or related position.
- Proficient in advanced SQL, including query optimization and performance tuning.
- Experienced in managing and designing architecture on at least one major cloud platform (Google Cloud Platform, AWS, or Azure).
- Skilled in using Python for data processing and advanced pipeline development.
- Experienced with tools and technologies for data ingestion, connectivity, and management.
- Deep understanding of data modeling principles, data warehousing methodologies, and modern data architecture.
- Excellent analytical and problem-solving skills.
- Communication and teamwork skills.
- Ability to plan and manage tasks effectively.
Experience:
5+ years
Skills:
Python, ETL, Java
Job type:
Full-time
Salary:
Negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
Experience:
6+ years
Skills:
ETL, SQL, DevOps, English
Job type:
Full-time
Salary:
Negotiable
- Build and run ETL workloads on Azure Data Factory (ADF).
- Design of cloud architectures with ADF as a core component.
- Creation of pipelines, linked services, and data flows in ADF.
- Expertise in Azure SQL Database (permissions management, roles, authentication).
- Creation of advanced scripts for Azure SQL Database.
- Setup and configuration of Azure Storage Accounts (blob containers, tables, files).
- Setup and configuration of Azure Key Vault.
- Data expertise support to development teams.
- Securing data-related Azure resources (Storage Account, SQL database, key vaults, managed identity).
- Innovation activities, with POC implementations to explore new Azure data workloads.
- Advanced Azure DevOps pipeline configuration.
- Experience leading and making decisions independently.
- Experience as a Data Architect implementing solutions.
- Qualifications: Bachelor's degree or higher in a related field.
- 6-9 years of relevant experience.
- Proactive, rigorous & committed.
- Good synthesis skills and ability to convey information effectively.
- Quick learning on projects and ability to work on multiple projects simultaneously.
- Fluent English communication skills.
- Azure Data Factory.
- Azure Managed Services (PaaS): Azure SQL Database, Storage Account, Logic Apps.
- Azure Data Architecture.
- Azure DevOps.
- Azure Key Vault.
- ARM Templates, Terraform.
- Tableau Software / Power BI.
- MS SQL Server 2016/2019 (Data Engine, Analysis Services - Cubes Multidimensional).
- Additional information: This position is based in Bangkok.
- The role involves global collaboration, requiring flexibility to accommodate time zone differences.
Experience:
3+ years
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
Negotiable
- Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1000 clients across the globe, we have been rolling out solutions in major projects for over a decade - this is made possible by an international team of 7,600 people spread across 5 continents and more than 60 countries. Our solutions focus on four different Business Lines: Information System & Digital, Telecom, Life Sciences and Engineering. We're focused on building and nurturing a top talent community where all our team members can achieve their full potential ...
- Brief Call: Our process typically begins with a brief virtual/phone conversation to get to know you! The objective? Learn about you, understand your motivations, and make sure we have the right job for you!
- Interviews (the average number of interviews is 3 - the number may vary depending on the level of seniority required for the position). During the interviews, you will meet people from our team: your line manager of course, but also other people related to your future role. We will talk in depth about you, your experience, and skills, but also about the position and what will be expected of you. Of course, you will also get to know Amaris: our culture, our roots, our teams, and your career opportunities!
- Case study: Depending on the position, we may ask you to take a test. This could be a role play, a technical assessment, a problem-solving scenario, etc.
- As you know, every person is different and so is every role in a company. That is why we have to adapt accordingly, and the process may differ slightly at times. However, please know that we always put ourselves in the candidate's shoes to ensure they have the best possible experience.
- We look forward to meeting you!
- Design and optimize data pipelines and ETL/ELT workflows using Databricks and Apache Spark.
- Build and maintain data models and data lakes to support analytics and reporting.
- Develop reusable Python code for transformation, orchestration, and automation.
- Implement and tune complex PySpark and SQL queries for large-scale data processing.
- Collaborate with Data Scientists, Analysts, and Business Units to deliver scalable solutions.
- Ensure data quality, governance, and metadata management across projects.
- Manage Azure cloud services for data infrastructure and deployment.
- Support daily operations and performance of the Databricks platform.
- ABOUT YOU
- 3+ years of experiences in Data Engineering.
- Experience with Databricks, Unity Catalog, Apache Spark, and distributed data processing.
- Strong proficiency in Python, PySpark, SQL.
- Knowledge of data warehousing concepts, data modeling, and performance optimization.
- Experience with Azure cloud data platforms (e.g., Azure Synapse).
- Familiarity with CI/CD and version control (Git, BitBucket).
- Understanding of real-time data streaming and tools such as Qlik for replication.
- Academic background: Bachelor's or Master's in Computer Science, Engineering, or related field.
- Fluent English. Another language is a plus.
- You have excellent problem-solving skills and can work independently as well as in a team.
- WHY AMARIS?
- Global Diversity: Be part of an international team of 110+ nationalities, celebrating diverse perspectives and collaboration.
- Trust and Growth: With 70% of our leaders starting at entry-level, we're committed to nurturing talent and empowering you to reach new heights.
- Continuous Learning: Unlock your full potential with our internal Academy and over 250 training modules designed for your professional growth.
- Vibrant Culture: Enjoy a workplace where energy, fun, and camaraderie come together through regular afterworks, team-building events, and more.
- Meaningful Impact: Join us in making a difference through our CSR initiatives, including the WeCare Together program, and be part of something bigger.
- Equal opportunity
- Amaris Consulting is proud to be an equal opportunity workplace. We are committed to promoting diversity within the workforce and creating an inclusive working environment. For this purpose, we welcome applications from all qualified candidates regardless of gender, sexual orientation, race, ethnicity, beliefs, age, marital status, disability or other characteristics.
Skills:
Sales, Hadoop, ETL, English
Job type:
Full-time
Salary:
Negotiable
- Bachelor's degree or equivalent practical experience.
- 10 years of experience in software sales or account management.
- Experience promoting analytics, data warehousing, or data management software.
- Ability to communicate fluently in English and Thai to support APAC customers.
- Experience with business intelligence front-end, data analytics middleware, or back-end data warehouse technologies.
- Experience working with sales engineers and customer technical leads to build business cases for transformation and accompanying plans for implementation.
- Understanding of data analytics technology stack (e.g., Hadoop/Spark, Columnar data warehouses, data streaming, ETL and data governance, predictive analytics, data science framework, etc.).
- Understanding of Google Cloud Data and Analytics offerings (e.g., BigQuery, Looker, Dataproc, Pub/Sub, etc.).
- Ability to engage and influence executive stakeholders as a business advisor and thought leader in data and analytics.
- Excellent business acumen and problem-solving skills.
- As a member of the Google Cloud team, you inspire leading companies, schools, and government agencies to work smarter with Google tools like Google Workspace, Search, and Chrome. You advocate for the innovative power of our products to make organizations more productive, collaborative, and mobile. Your guiding light is doing what's right for the customer: you will meet customers exactly where they are and provide them the best solutions for innovation. Using your passion for Google products, you help spread the magic of Google to organizations around the world.
- In this role, you will build an understanding of our customers' businesses and bring expertise to executive-level relationships to help them deliver their strategies. You will leverage expertise in promoting data analytics and work with account teams, customer engineering, and partners to ensure customer outcomes.
- Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
- Calibrate business against the objectives and results, forecast and report the state of the business for the assigned territory.
- Build and maintain executive relationships with customers as the data analytics subject matter expert, influencing direction.
- Develop and execute account plans, including a broader enterprise plan across industries. Focus on building accounts.
- Assist customers in identifying use cases suitable for Google Cloud Data and Analytics solutions, articulating solution differentiation and business impacts.
- Work with Google account and technical teams to develop and drive pipelines, and provide expertise. Develop Go-To-Market (GTM) efforts with Google Cloud Platform partners.
- Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Experience:
5+ years
Skills:
ETL, Quantitative Analysis, Industry Trends
Job type:
Full-time
Salary:
Negotiable
- Translating business requirements to technical solutions leveraging strong business acumen.
- You will be a core member of the EY Microsoft Data and AI team, responsible for extracting large quantities of data from clients' IT systems, developing efficient ETL and data management processes, and building architectures for rapid ingestion and dissemination of key data.
- Apply expertise in quantitative analysis, data mining and presentation of data to de ...
- Extremely flexible, with experience managing multiple tasks and priorities under deadlines.
- Applying technical knowledge to architect solutions that meet business and IT needs, create Data Platform roadmaps, and enable the Data Platforms to scale to support additional use cases.
- Staying abreast of current business and industry trends relevant to the client's business.
- Monitoring progress, managing risk, and ensuring key stakeholders are kept informed about progress and expected outcomes.
- Understanding customers overall data estate, IT and business priorities and success measures to design implementation architectures and solutions.
- Strong team collaboration and experience working with remote teams.
- Working on large-scale client engagements. Fostering relationships with client personnel at appropriate levels. Consistently delivering quality client services. Driving high-quality work products within expected timeframes and on budget.
- Demonstrated significant professional experience in commercial, strategy, and/or research/analytics roles, interacting with senior stakeholders to communicate insights effectively.
- Execute on building data solutions for business intelligence and assist in effectively managing and monitoring the data ecosystem of analytics, data lakes, warehouse platforms, and tools.
- Provide directional guidance and recommendations around data flows including data technology, data integrations, data models, and data storage formats.
- To qualify for the role, you must have:
- Bachelor's degree or MS degree in Business, Economics, Technology Entrepreneurship, Computer Science, Informatics, Statistics, Applied Mathematics, Data Science, or Machine Learning.
- Minimum of 3-5 years of relevant consulting experience with a focus on advanced analytics and business intelligence, or similar roles. New graduates are welcome!
- Communication and critical thinking are essential; you must be able to listen to and understand the question, and develop and deliver clear insights.
- Experience communicating the results of analysis to both technical and non-technical audiences.
- Independent and able to manage and prioritize workload.
- Ability to adapt quickly and positively to change.
- Breadth of technical passion, desire to learn, and knowledge of services.
- Willingness and ability to travel to meet clients if needed.
- Ideally, you'll also have:
- Experience working business or IT transformation projects that have supported data science, business intelligence, artificial intelligence, and cloud applications at scale.
- Ability to communicate clearly and succinctly, adjusting to a variety of styles and audiences, with the ability to tell compelling stories with data.
- Experience with C#, VBA, JavaScript, R.
- A broad understanding of key BI trends and the BI vendor landscape.
- Working experience with Agile and/or Scrum methods of delivery.
- Working experience with design-led thinking.
- Microsoft Certifications in the Data & AI domain.
- We're interested in passionate leaders with strong vision and a desire to deeply understand the trends at the intersection of business and Data and AI. We want a customer-focused professional who is motivated to drive the creation of great enterprise products and who can collaborate and partner with other product teams and engineers. If you have a genuine passion for helping businesses achieve the full potential of their data, this role is for you.
- What we offer:
- We offer a competitive compensation package where you'll be rewarded based on your performance and recognized for the value you bring to our business. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
- If you can demonstrate that you meet the criteria above, please contact us as soon as possible.
- The exceptional EY experience. It's yours to build.
- EY | Building a better working world.
- EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
- Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
- Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
- EY is an equal opportunity, affirmative action employer providing equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, national origin, protected veteran status, disability status, or any other legally protected basis, in accordance with applicable law.
Skills:
Sales, Salesforce, Java
Job type:
Full-time
Salary:
Negotiable
- Design and develop Salesforce solutions based on customer requirements.
- Implement User Stories in an agile approach for CRM Systems or Salesforce applications.
- Analyses technical requirements and translates them into implementable application designs and configurations. Evaluates possibilities with different technologies and platforms.
- Understand client business process and potential constraints (budget, timeline, expertise, etc.) to define optimal and reasonable project scope and expectations.
- Create and defend solution estimate and SOW.
- Coordinate with Solutions Architects on integration, and data consultants, and others, as needed, for specific technical design requirements.
- At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Senior Consultants across our Firm are expected to:
- Develop diverse, high-performing people and teams through new and meaningful development opportunities.
- Collaborate effectively to build productive relationships and networks.
- Understand and lead the execution of key objectives and priorities for internal as well as external stakeholders.
- Align your team to key objectives as well as set clear priorities and direction.
- Make informed decisions that positively impact the sustainable financial performance and enhance the quality of outcomes.
- Influence stakeholders, teams, and individuals positively - leading by example and providing equal opportunities for our people to grow, develop and succeed.
- Lead with integrity and make a strong positive impact by energising others, valuing individual differences, recognising contributions, and inspiring self-belief.
- Deliver superior value and high-quality results to stakeholders while driving high performance from people across Deloitte.
- Apply their understanding of disruptive trends and competitor activity to recommend changes, in line with leading practices.
- Requirements: 6+ years CRM experience with a minimum of 3 years on the Salesforce core platform and Salesforce Marketing Cloud.
- At least 3 full life-cycle Salesforce implementations, with strong expertise as well as certifications in the following modules: Sales Cloud, Service Cloud, Marketing Cloud, Community Cloud, Force.com, Apttus.
- Development and troubleshooting experience with Salesforce (Apex, Visualforce, Lightning, Java/C#/OOP, JavaScript/jQuery, Angular, JS/Bootstrap, SQL/SOQL, Web Services) will be preferred.
- Strong understanding of Agile / Iterative delivery methodology.
- Knowledge of data integration tools and experience integrating Salesforce with different business systems (ETL, CPQ, marketing automation, reporting, etc.).
- Understanding of systems architecture and ability to design scalable performance-driven solutions.
- Familiarity with platform authentication patterns (SAML, SSO, OAuth).
- Strong understanding of environment management, release management, code versioning best practices, and deployment methodologies.
- Responsible for project deliverables and capacity planning, and for managing the development team.
- Ensure utilization of staff is optimized by tracking individual team members' forecasts.
- Allocating resources and responsibilities across the team to deliver business results and develop team members.
- Responsible for supporting quality programs throughout the entire SDLC period.
- An appreciation of the consulting lifestyle and the ability to travel (both locally and abroad) is a prerequisite for our short-term and long-term project assignments.
- Due to the volume of applications, we regret that only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website. Requisition ID: 109808. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
Skills:
Compliance, Assembly, Web Services
Job type:
Full-time
Salary:
Negotiable
- Collaborates with stakeholders to understand business requirements and translate them into scalable, secure, and cost-effective cloud solutions.
- Viewed as a trusted technical advisor to the client and ensure technical solutions will accomplish the client's objectives.
- Designs and architects less complex cloud-based systems, ensuring high availability, scalability, performance, and reliability.
- Provides pre-sales technical support and know-how in analyzing client requirements, in conjunction with the client's current collaboration capabilities.
- Assesses existing systems and develops migration strategies to transition on-premises applications and infrastructure to the cloud.
- Designs integration solutions to enable seamless data flow between cloud and on-premises environments.
- Defines and implements security best practices and compliance standards for cloud-based systems.
- Supports the development and maintenance of cloud governance frameworks, policies, and procedures.
- Provides support to development teams to ensure adherence to cloud architecture standards and best practices.
- Develops or produces the technical design document to match the solution design specifications.
- Working with the relevant internal stakeholders, participate in scope of work determination, product pricing and RFP/RFI responses.
- Assists with the determination of outsourcing, product pricing and collaborates with others to develop an implementation solution.
- Influences and guides members of the Sales team to ensure that they are equipped to close deals and maintain visibility of forecasting and the sales pipeline in order to influence potential deals.
- Manages client proof of concept (POC) initiatives, which will require the involvement of the appropriate resources, and setup and delivery of the POC.
- On all assigned engagements, owns the proposed solution and transitions the build / implementation to the delivery team.
- Serve as a subject matter expert on cloud technologies and architectures.
- Collaborate with cross-functional teams, including developers, operations, and project managers, to ensure alignment of technical solutions with business objectives.
- Specifically relating to opportunity pursuit, this role will: evaluate each opportunity for alignment with organizational capabilities and business policy; prepare the executive summary that outlines all of the information gathered from the client regarding their needs, as understood; document the proposed technology solution; document the statement of work along with all labor requirements; work with the relevant internal stakeholders to prepare the pricing format that will be supplied to the customer; perform the actual solution design and prepare a parts list outlining equipment to be provided; develop and manage a proof of concept as may be required; engage all technical resources required for an accurate solution design; prepare a network diagram outlining the proposed solution; document all deliverables and what constitutes successful completion; review the final parts list as supplied and submit all information to the applicable bid team for final assembly; verify the proposal's accuracy and sign off on the final documents to be presented to the client; and assist during the final presentation to the client as appropriate.
- Demonstrates good client engagement skills coupled with technical consulting aptitude.
- Understanding of the vendor's products, business, and technology positioning.
- Ability to collaborate and communicate effectively with team members, contributing to their success.
- Broad product knowledge integrated with technology understanding.
- Good knowledge of cloud architecture patterns, including microservices, serverless computing, containers, and hybrid cloud deployments.
- Proficiency in cloud infrastructure technologies, such as virtual machines, storage solutions, networking, and load balancing.
- In-depth understanding of cloud security principles, including identity and access management, encryption, and compliance frameworks.
- Familiarity with IaC tools and frameworks such as Terraform, AWS CloudFormation, Azure Resource Manager, or Google Cloud Deployment Manager.
- Multi-faceted knowledge such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), or other providers, understanding the specific services offered by each platform, including compute, storage, databases, networking, and security.
- Understanding of cloud networking concepts, including virtual networks, subnets, routing, load balancing, and firewall configurations.
- Knowledge of VPNs, VPC peering, and hybrid connectivity options between on-premises and cloud environments.
- Knowledge of identity and access management (IAM), encryption, data protection, secure network configurations, and compliance frameworks such as GDPR, HIPAA, or PCI-DSS.
- Proficiency in cloud storage solutions such as Amazon S3, Azure Blob Storage, or Google Cloud Storage.
- Understanding of different database options, including relational databases (e.g., Amazon RDS, Azure SQL Database) and NoSQL databases (e.g., Amazon DynamoDB, Azure Cosmos DB).
- Knowledge of cloud monitoring and management tools such as AWS CloudWatch, Azure Monitor, or Google Cloud Monitoring.
- Familiarity with DevOps principles and practices, including continuous integration and continuous deployment (CI/CD).
- Knowledge of integration technologies such as API gateways, messaging queues, and ETL (Extract, Transform, Load) processes. Basic understanding of key vendor subscription models such as Cisco EA 3.0.
- Bachelor's degree in information technology, computer science or information systems or related field.
- Vendor product, sales and technology certifications.
- Relevant cloud certifications such as AWS Certified Solutions Architect, Microsoft Certified: Azure Solutions Architect or Google Cloud Certified - Professional Cloud Architect.
- Moderate level technical experience within a large scale (preferably multi-national) technology services environment.
- Moderate level experience as a Cloud Technical Architect or a similar role, designing and implementing cloud architectures for complex systems and applications.
- Moderate level experience in designing, implementing, and managing cloud-based solutions, preferably using leading cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
- Moderate level experience in project management methodologies.
- Moderate level experience with major cloud platforms.
- Moderate level experience with serverless computing platforms like AWS Lambda, Azure Functions, or Google Cloud Functions.
- Moderate level experience with automation and orchestration tools like Ansible, Chef, or Puppet to streamline the provisioning, configuration, and management of cloud resources.
- On-site working.
- About NTT DATA:
- NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.
- Equal Opportunity Employer
- NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
Skills:
Power BI, Tableau, SQL
Job type:
Full-time
Salary:
Negotiable
- Data Lifecycle Management: A Data PM oversees the entire lifecycle of data projects, from data acquisition, integration, and storage to analysis and visualization. This involves significant understanding of technical processes and data systems.
- Collaboration with Technical Teams: Work closely with engineers, data scientists, and IT teams to ensure data pipelines and infrastructures align with project goals. This requires a deep understanding of technical jargon, workflows, and dependencies.
- Monitoring and Reporting: Track project progress and provide regular updates and rep ...
- Mastery of Technical Tools and Platforms.
- BI Tools: Power BI, Tableau.
- Project & Code Collaboration: GitHub, JIRA, Confluence.
- Cloud Systems: AWS, Azure, Google Cloud for data solutions.
- The role often requires working knowledge of SQL, Python, or other languages to interpret project outcomes, test processes, and validate results.
- Technical Decision-Making Authority.
- System Architecture: A Data PM may decide how data systems should be architected or what infrastructure to adopt based on project requirements.
- Tool Selection: Selecting appropriate analytics tools, databases, or platforms for project success is a regular part of the role.
- Ensuring Data Integrity: Data governance, accuracy, and validation are all technical concerns within a Data PM's purview.
- Challenges of the Data PM Role.
- Translate business needs into data requirements.
- Collaborate meaningfully with technical teams on implementation.
- Ensure compliance with technical standards (e.g., data security, privacy).
- Bachelor's or master's degree in a relevant field, such as Data Science, Computer Science, Business Administration, or Supply Chain Management.
- 3-5 years of experience in Project Management or a similar role, preferably in Data or IT domains.
- Strong understanding of Retail, Wholesale, or Supply Chain processes.
- Proficient in project management tools like Jira, Trello, or Microsoft Project.
- Experience with Agile/Scrum methodologies.
- Familiarity with:
- Data Analytics: SQL, Python, or other languages to interpret project outcomes.
- Data Engineering: Data pipelines, ETL processes, data storage systems.
- Data Science: Algorithms, machine learning models, statistical analysis, A/B testing.
- Technical Roadblocks: Anticipating and resolving issues like system integration, latency, and scalability.
- Excellent communication and stakeholder management skills.
- Proficient in English communication.
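The "SQL, Python, or other languages to interpret project outcomes" expectation above can be illustrated with a minimal sketch, using Python's built-in sqlite3 module to validate a KPI. The orders table, its columns, and the revenue-per-region metric are hypothetical examples, not part of any listed role.

```python
import sqlite3

# Hypothetical mini-dataset a Data PM might sanity-check during a project.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 50.0)],
)

# A typical KPI validation: total revenue per region.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 200.0), ('South', 50.0)]
```

A check like this is often enough to confirm that a dashboard figure matches the underlying data before it is presented to stakeholders.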
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
3+ years
Skills:
ERP, SQL, Power BI, Data Analysis, Business Statistics / Analysis, Thai, English
Job type:
Full-time
Salary:
฿40,000 - ฿60,000, negotiable
- Work with customers and business users to gather requirements and summarize key points for project development.
- Integrate data from various business systems and sources (e.g., ERP, CRM, and others) to ensure comprehensive and accurate analysis.
- Design and develop ETL (Extract, Transform, Load) processes to extract data from various sources.
- Prepare the data so it is ready for analysis; data cleaning also involves handling missing and inconsistent data that may affect the analysis.
- Design, develop, and implement data models to serve customer requirements.
- Design, build, and maintain dashboards and reports that visualize key metrics and performance indicators.
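The ETL and data-cleaning responsibilities above can be sketched in plain Python. This is a minimal illustration only; the record layout, the "unknown" placeholder, and the rule of dropping rows with missing revenue are assumed examples, not requirements of the role.

```python
# Extract: rows as they might arrive from a hypothetical ERP/CRM export.
raw_rows = [
    {"customer": "Acme", "revenue": "1200"},
    {"customer": "", "revenue": "300"},      # missing customer name
    {"customer": "Beta", "revenue": None},   # missing revenue value
]

def clean(rows):
    """Transform: drop rows without revenue, default missing names."""
    cleaned = []
    for row in rows:
        if row["revenue"] is None:           # inconsistent/missing value
            continue                         # exclude from analysis
        cleaned.append({
            "customer": row["customer"] or "unknown",  # assumed placeholder
            "revenue": float(row["revenue"]),          # normalize the type
        })
    return cleaned

# Load: the cleaned records are now ready for a data model or BI dashboard.
warehouse = clean(raw_rows)
print(warehouse)
```

In practice the same extract-clean-load shape is usually expressed with SQL, pandas, or an ETL framework rather than hand-written loops, but the steps are the same.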
- Bachelor's degree in Data Analytics, Computer Engineering, Computer Science, MIS, Statistics, or related fields.
- At least 2-3 years' experience in Data Analysis or Business Analysis.
- Proficiency in data tools such as SQL, Excel, BI (Power BI and Tableau), or Python.
- Experience in ERP, CRM, Cloud Technology, Software Development is preferred.
- Very good problem-solving, negotiation, presentation, and communication skills.
- Good written and verbal English communication.
- A collaborative team player with effective communication abilities.