Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source applications and the Enterprise Data Warehouse.
- Experience with Big Data or data warehousing.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in an RDBMS such as Oracle.
- Develop ETL technical specifications; design, develop, test, implement, and support optimal data solutions.
- Develop and document ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and write code for all related data extraction, transformation, and loading (ETL) into the databases under your responsibility (a minimal sketch follows this posting).
- Create, execute, and document unit test plans for ETL and data integration processes and programs.
- Perform problem assessment, resolution, and documentation for existing ETL packages, mappings, and workflows in production.
- Tune the performance of ETL processes and SQL queries, and recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IaC tools like Terraform or Azure ARM templates.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
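For illustration, here is a minimal sketch of the extract-transform-load pattern these duties describe. It is not from the posting: the src_orders and stg_orders tables are hypothetical, and sqlite3 stands in for the source system and the Enterprise Data Warehouse (a real job would reach Oracle or another RDBMS through the same DB-API style of calls).

    # Minimal ETL sketch: extract from a source table, transform in memory,
    # load into a warehouse staging table. Table and column names are
    # hypothetical; sqlite3 stands in for the real systems.
    import sqlite3

    def run_etl(source, dwh):
        # Extract: pull rows from the source application table.
        rows = source.execute(
            "SELECT order_id, amount, order_date FROM src_orders"
        ).fetchall()
        # Transform: drop rows with missing keys or amounts, normalize amounts.
        cleaned = [
            (oid, round(float(amt), 2), odate)
            for oid, amt, odate in rows
            if oid is not None and amt is not None
        ]
        # Load: replace the previous batch in the staging table.
        dwh.execute("DELETE FROM stg_orders")
        dwh.executemany(
            "INSERT INTO stg_orders (order_id, amount, order_date) VALUES (?, ?, ?)",
            cleaned,
        )
        dwh.commit()

    # Demo with in-memory databases.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE src_orders (order_id INT, amount REAL, order_date TEXT)")
    src.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                    [(1, 10.5, "2024-01-01"), (2, None, "2024-01-02")])
    dwh = sqlite3.connect(":memory:")
    dwh.execute("CREATE TABLE stg_orders (order_id INT, amount REAL, order_date TEXT)")
    run_etl(src, dwh)
    print(dwh.execute("SELECT * FROM stg_orders").fetchall())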
Skills:
Data Analysis, ETL, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Data Architecture: Design, develop, and maintain the overall data architecture and data pipeline systems to ensure efficient data flow and accessibility for analytical purposes.
- Data Integration: Integrate data from multiple sources, including point-of-sale systems, customer databases, e-commerce platforms, supply chain systems, and other relevant data sources, ensuring data quality and consistency.
- Data Modeling: Design and implement data models that are optimized for scalability, ...
- Data Transformation and ETL: Develop and maintain efficient Extract, Transform, and Load (ETL) processes to transform raw data into a structured format suitable for analysis and reporting.
- Data Warehousing: Build and maintain data warehouses or data marts that enable efficient storage and retrieval of structured and unstructured data for reporting and analytics purposes.
- Data Governance and Security: Establish and enforce data governance policies and procedures, including data privacy and security measures, to ensure compliance with industry regulations and protect sensitive data.
- Data Quality and Monitoring: Implement data quality checks and monitoring mechanisms to identify and resolve data inconsistencies, anomalies, and issues in a timely manner (a small sketch follows this posting).
- Collaboration: Collaborate with cross-functional teams, including data scientists, business analysts, and software engineers, to understand their data needs, provide data solutions, and support their analytical initiatives.
- Performance Optimization: Optimize data processing and query performance to ensure efficient data retrieval and analysis, considering factors such as data volume, velocity, and variety.
- Documentation: Maintain documentation of data processes, data flows, data models, and system configurations, ensuring accuracy and accessibility for future reference.
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Strong programming skills in languages such as Python and SQL; C++ is a plus.
- At least 5 years of experience with data modeling, database design, and data warehousing concepts.
- Proficiency in working with relational databases (e.g., MySQL, PostgreSQL) and big data technologies (e.g., Hadoop, Spark, Hive).
- Familiarity with cloud-based data platforms, such as AWS.
- Knowledge of ETL tools and techniques for data integration and transformation.
- Understanding of data governance, data security, and regulatory compliance requirements.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.
- Ability to work in a fast-paced environment and handle multiple projects simultaneously.
- Location: BTS Ekkamai
- Working Day: Mon-Fri.
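As referenced above, a small sketch of what automated data-quality checks might look like in pandas. The order_id and amount columns and the specific rules are illustrative assumptions, not part of the posting.

    # Hedged sketch of simple data-quality checks: nulls, duplicate keys,
    # and out-of-range values on a pandas DataFrame.
    import pandas as pd

    def quality_report(df: pd.DataFrame) -> dict:
        return {
            "row_count": len(df),
            "null_counts": df.isna().sum().to_dict(),
            "duplicate_keys": int(df.duplicated(subset=["order_id"]).sum()),
            "negative_amounts": int((df["amount"] < 0).sum()),
        }

    orders = pd.DataFrame({
        "order_id": [1, 2, 2, 4],
        "amount": [10.0, -5.0, 7.5, None],
    })
    print(quality_report(orders))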
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and perform database design based on that understanding and on the requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming (a PySpark batch sketch follows the qualifications below).
- Own the end-to-end data ETL/ELT process framework, from data source to data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years of experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert-level experience with query languages: SQL, Databricks SQL, PostgreSQL.
- Experience with Big Data technologies like Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
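The PySpark batch sketch referenced above: a hedged example of the kind of batch aggregation such a pipeline might run, assuming a local Spark session and illustrative column names (it requires a working pyspark installation).

    # Minimal PySpark batch job: aggregate raw events per user, the kind
    # of transform that would feed a warehouse table.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

    # Hypothetical raw events; a real job would read from storage instead.
    events = spark.createDataFrame(
        [("u1", "click", 3), ("u1", "view", 1), ("u2", "click", 2)],
        ["user_id", "event_type", "count"],
    )

    totals = (
        events.groupBy("user_id", "event_type")
        .agg(F.sum("count").alias("total"))
    )
    totals.show()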
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
ETL, Compliance, SSIS, English
Job type:
Full-time
Salary:
negotiable
- Design and build scalable, high-performance data pipelines and ETL processes to support seamless data integration across systems (an idempotent-load sketch follows this posting's requirements).
- Partner closely with data analysts, data scientists, and business stakeholders to understand their needs and transform them into efficient technical solutions.
- Develop and manage robust data models, ensuring consistent data quality, integrity, and accessibility.
- Optimize data processing pipelines, troubleshoot performance issues, and implement improvements to enhance efficiency.
- Document workflows, system configurations, and processes to maintain compliance and facilitate future enhancements.
- Stay ahead of the curve by keeping up-to-date with the latest advancements in data engineering, cloud platforms, and analytics tools.
- ABOUT YOU.
- Demonstrated expertise as a Data Engineer, with a strong background in designing and maintaining ETL pipelines using tools like Databricks and SSIS.
- Proficient in SQL, with a deep understanding of data warehousing concepts and best practices.
- Familiarity with cloud platforms such as Azure or AWS is highly desirable.
- Strong analytical and problem-solving skills, coupled with an exceptional attention to detail.
- Excellent communication and collaboration abilities, with fluency in English and a proven track record of working effectively with cross-functional teams.
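A hedged sketch of the idempotent-load idea behind scalable ETL pipelines: an UPSERT keeps the target consistent if the same batch is replayed. sqlite3 (3.24+ for ON CONFLICT) stands in for the warehouse here; on Databricks the equivalent would typically be a MERGE INTO statement. The dim_customer table is hypothetical.

    # Idempotent load step: replaying the same batch leaves the target
    # table in the same state instead of duplicating rows.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

    batch = [(1, "Alice"), (2, "Bob")]
    conn.executemany(
        """
        INSERT INTO dim_customer (id, name) VALUES (?, ?)
        ON CONFLICT(id) DO UPDATE SET name = excluded.name
        """,
        batch,
    )
    conn.commit()
    print(conn.execute("SELECT * FROM dim_customer").fetchall())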
- WHY AMARIS?
- A global community of talented professionals, fostering innovation and collaboration.
- An empowering environment that prioritizes trust and offers ample opportunities for growth and development.
- Access to world-class training programs and resources to advance your career.
- A workplace culture centered around teamwork, inclusivity, and a commitment to corporate social responsibility.
- Amaris Consulting is proud to be an equal-opportunity workplace. We are committed to promoting diversity within the workforce and creating an inclusive working environment. For this purpose, we welcome applications from all qualified candidates regardless of gender, sexual orientation, race, ethnicity, beliefs, age, marital status, disability, or other characteristics.
Experience:
3 years required
Skills:
ETL, DevOps, Automation, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Data Pipeline Development: Design, implement, and optimize scalable ETL/ELT pipelines to ingest, transform, and store structured and unstructured data in a cloud environment (AWS at the core, but not limited to it).
- Machine Learning Pipeline Development: Work collaboratively with data scientists to productionize and maintain scalable machine learning services. The solutions encompass a variety of approaches, including traditional and near real-time machine learning, deployed across multi-state service architectures.
- Data Platform: Collaborate closely with DevOps and infrastructure teams to design, implement, and manage scalable data storage and processing platforms. Leverage AWS services such as S3, Redshift, Glue, Lambda, Athena, and EMR to ensure performance, reliability, and cost-efficiency.
- Data Modeling and Schema Management: Develop and maintain robust data models and schemas to support analytics, reporting, and operational requirements. Adhere to the design principle of establishing a "single version of truth" to ensure consistency, accuracy, and reliability across all data-driven processes.
- Data/AI Quality-as-a-Service Development: Design, develop, and maintain scalable "Data/AI Quality-as-a-Service" solutions, adhering to zero-ops design principles. The scope of quality includes monitoring data drift, analyzing performance metrics, and detecting model drift to ensure consistent, reliable, and high-performing AI systems.
- Cross-Functional Collaboration: Collaborate closely with data scientists, analysts, and application developers to ensure the seamless integration of data solutions into workflows, enhancing functionality and enabling data-driven decision-making.
- Automation & Monitoring: Design and implement robust monitoring and automation frameworks to ensure the high availability, performance, and cost-efficiency of data workflows, guided by the principle of "Zero Ops by Design".
- Compliance & Security: Uphold data security, privacy, and compliance with banking regulations and industry standards, ensuring all solutions meet rigorous governance requirements.
- Continuous Improvement: Stay informed about emerging technologies and trends in cloud data engineering, advocating for their adoption to enhance system capabilities and maintain a competitive edge.
- Educational Background: Bachelor's degree in Computer Science, Computer Engineering, Data Engineering, or a related field.
- Experience: 3+ years of experience in cloud data engineering or similar roles.
- Proven expertise in cloud data technologies.
- Hands-on experience with big data technologies such as Apache Spark.
- Technical Skills: Proficiency in SQL and programming languages such as Python, Java, or Scala.
- Expertise in data pipeline and workflow orchestration tools for both batch and real-time processing (e.g., Apache Airflow, AWS Step Functions); a minimal Airflow sketch follows this list.
- Understanding of data warehouse and lakehouse architectures.
- Familiarity with software development best practices, including SDLC concepts, CI/CD/(+CL) pipelines, and Infrastructure as Code tools (e.g., Terraform, AWS CloudFormation).
- Other Skills: Strong problem-solving and analytical thinking capabilities.
- Excellent communication and collaboration skills.
- Preferred Qualifications: AWS Data Analytics - Specialty certification or equivalent experience.
- Experience in banking or fintech environments. Understanding of financial data and regulatory requirements.
- Familiarity with real-time data processing and stream analytics.
- Experience in working end-to-end with data scientists and analysts as part of "AnalyticsOps" to develop and maintain ML/AI services is a strong advantage.
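The minimal Airflow sketch referenced above: one way batch orchestration with Apache Airflow might look, assuming Airflow 2.4+ (where the schedule parameter replaced schedule_interval). The DAG id, task bodies, and the AWS services named in the comments are placeholders, not the bank's actual pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder task bodies; a real pipeline would trigger Glue jobs,
    # Redshift loads, etc.
    def extract():
        print("pull the raw batch from the source, e.g. S3")

    def transform():
        print("clean and aggregate the batch")

    def load():
        print("write the result to the warehouse, e.g. Redshift")

    with DAG(
        dag_id="daily_etl_sketch",      # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_transform >> t_load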
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention and no need whatsoever to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Accordingly, please do not upload any documents, including copies of your national ID card, and do not enter sensitive personal data or any other data that is not relevant or necessary for the purpose of this job application on the website. Please also make sure you have removed any sensitive personal data (if any) from your resume and other documents before uploading them to the website. The Bank does need to collect personal data concerning your criminal record in order to consider your application for employment, or to verify your qualifications, disqualifying attributes, or suitability for the position; consent to the collection, use, or disclosure of such data is necessary for entering into a contract and for being considered for the purposes above. If you do not give this consent, or later withdraw it, the Bank may be unable to proceed for those purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
SQL, Data Warehousing, ETL, English
Job type:
Full-time
Salary:
negotiable
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Retrieve, prepare, and process a rich variety of data sources.
- Apply data quality, cleaning, and semantic inference techniques to maintain high data quality (a small cleaning sketch follows this posting).
- Explore data sources to better understand the availability and quality/integrity of data.
- Gain fluency in AI/ML techniques.
- Experience with relational database systems with expertise in SQL.
- Experience in data management, data warehousing or unstructured data environments.
- Experience with data integration or ETL management tools such as Talend, Apache Airflow, AWS Glue, Google DataFlow or similar.
- Experience programming in Python, Java or other equivalent is a plus.
- Experience with Business Intelligence tools and platforms is a plus, e.g. Tableau, QlikView, Google DataStudio, Google Analytics or similar.
- Experience with Agile methodology and Extreme Programming is a plus.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Good communication in English.
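The cleaning sketch referenced above: a small, hedged example of data cleaning and normalization in pandas. The signup_date and country columns and the alias map are assumptions for illustration.

    # Basic cleaning: parse dates defensively and normalize categorical text.
    import pandas as pd

    raw = pd.DataFrame({
        "signup_date": ["2024-01-03", "03/01/2024", None],
        "country": ["TH", "thailand", "TH "],
    })

    cleaned = raw.assign(
        # Values that cannot be parsed consistently become NaT for review.
        signup_date=pd.to_datetime(raw["signup_date"], errors="coerce"),
        # Trim, upper-case, and map known aliases to a canonical code.
        country=raw["country"].str.strip().str.upper().replace({"THAILAND": "TH"}),
    )
    print(cleaned)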
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and ensure data comprehensiveness for data consumers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in data processing: batch, real-time, and near real-time.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud (a small Parquet-partitioning sketch follows this posting).
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
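The Parquet-partitioning sketch referenced above: a hedged example of landing a batch in a data lake as partitioned Parquet. The local lake/events path stands in for an object store such as S3, the column names are illustrative, and pandas needs pyarrow installed for this call.

    # Land a batch as Parquet partitioned by date; date-bounded queries
    # can then skip irrelevant partitions.
    import pandas as pd

    df = pd.DataFrame({
        "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "user_id": ["u1", "u2", "u1"],
        "amount": [10.0, 5.5, 7.25],
    })

    # Writes lake/events/event_date=2024-01-01/..., etc.
    df.to_parquet("lake/events", partition_cols=["event_date"], engine="pyarrow")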
Experience:
3 years required
Skills:
Quality Assurance, Assurance, ETL, English
Job type:
Full-time
Salary:
negotiable
- Act as a strategic partner, providing analytics technology expertise and input to business functions, resulting in an optimal tech investment portfolio and effective digital solution recommendations from DG's expertise to meet business requirements.
- Take the lead among stakeholders throughout the organization in understanding data needs and identifying issues or opportunities for leveraging company data, proposing solutions that support decision-making and drive business outcomes.
- Take the lead in communicating with related parties (in analytics-related areas), i.e., ...
- Take lead in design and layout of information visualization for dashboards, presentations and infographics to analyze complex datasets in a simple and intuitive format in adherence to data visualization best practices.
- Develop, test, debug and integrate data visualization solutions and ensures visualization (including interaction) consistency on analytics projects.
- Take the lead in the technical quality assurance process to ensure the dashboard design and data modeling support all business requirements.
- Take the lead on the surrounding digital environment, i.e., other applications, and ensure that newly implemented BI/analytics work seamlessly through proper interfaces to optimize work effectiveness. Along the course of implementation, ensure the necessary data are integrated, transformed, and loaded, and that the system is brought into production successfully.
- Take the lead in delivering and cultivating business value by driving a data-driven company through maximum adoption of analytics, aligned with the digital transformation master plan.
- Control, manage, and govern Level 2 support: identify and fix configuration-related problems regarding BI and visualization.
- Act as project manager for data projects, managing project scope, timeline, and budget.
- Manage relationships with stakeholders and coordinate work between different parties as well as providing regular update.
- EXPERIENCE.
- At least 3-4 years of experience working in the BI and analytics area, designing and building BI dashboards.
- At least 3-4 years of experience in fully implemented IT projects (design, development, and support).
- Experienced in Oil and Gas business would be a strong asset.
- Knowledge of data analytics and computation tools, e.g., ETL/ELT tools and/or data visualization tools, including cloud platform solutions.
- Knowledge in enterprise software and technology such as SAP ECC, SAP BW, Power BI, AWS etc.
- Able to construct complex data models to help visualize and interpret data.
- A continuous learner who is challenged by new BI technologies.
- EDUCATION.
- Bachelor's degree in Computer Science, Computer Engineering, Information Technology, or a related discipline.
- Certificate related in data analytics area would be a strong asset.
- OTHER REQUIREMENTS.
- A self-starter attitude and eagerness to further develop new skills.
- Strong written and verbal English skills.
Experience:
5 years required
Skills:
ETL, Quantitative Analysis, Industry trends
Job type:
Full-time
Salary:
negotiable
- Translating business requirements to technical solutions leveraging strong business acumen.
- You will be a core member of the EY Microsoft Data and AI team, responsible for extracting large quantities of data from clients' IT systems, developing efficient ETL and data management processes, and building architectures for rapid ingestion and dissemination of key data.
- Apply expertise in quantitative analysis, data mining and presentation of data to de ...
- Extremely flexible, with experience managing multiple tasks and priorities on deadlines.
- Applying technical knowledge to architect solutions that meet business and IT needs, create Data Platform roadmaps, and enable the Data Platforms to scale to support additional use cases.
- Staying abreast of current business and industry trends relevant to the client's business.
- Monitoring progress, managing risk, and ensuring key stakeholders are kept informed about progress and expected outcomes.
- Understanding customers overall data estate, IT and business priorities and success measures to design implementation architectures and solutions.
- Strong team collaboration and experience working with remote teams.
- Working on large-scale client engagements. Fostering relationships with client personnel at appropriate levels. Consistently delivering quality client services. Driving high-quality work products within expected timeframes and on budget.
- Demonstrated significant professional experience of commercial, strategy and/or research/analytics interacting with senior stakeholders to effectively communicate insights.
- Execute on building data solutions for business intelligence and assist in effectively managing and monitoring the data ecosystem of analytics, data lakes, warehouses platforms and tools.
- Provide directional guidance and recommendations around data flows including data technology, data integrations, data models, and data storage formats.
- To qualify for the role, you must have:
- Bachelor's degree or MS degree in Business, Economics, Technology Entrepreneurship, Computer Science, Informatics, Statistics, Applied Mathematics, Data Science, or Machine Learning.
- Minimum of 3-5 years of relevant consulting experience with a focus on advanced analytics and business intelligence or similar roles. New graduates are welcome!
- Communication and critical thinking are essential, must be able to listen and understand the question and develop and deliver clear insights.
- Experience communicating the results of analysis to both technical and non-technical audiences.
- Independent and able to manage and prioritize workload.
- Ability to adapt quickly and positively to change.
- Breadth of technical passion and a desire to keep learning new knowledge and services.
- Willingness and ability to travel to meet clients if needed.
- Ideally, you'll also have:
- Experience working business or IT transformation projects that have supported data science, business intelligence, artificial intelligence, and cloud applications at scale.
- Ability to communicate clearly and succinctly, adjusting to a variety of styles and audiences, with the ability to tell compelling stories with data.
- Experience with C#, VBA, JavaScript, R.
- A vast understanding of key BI trends and the BI vendor landscape.
- Working experience with Agile and/or Scrum methods of delivery.
- Working experience with design led thinking.
- Microsoft Certifications in the Data & AI domain.
- We're interested in passionate leaders with strong vision and a desire to deeply understand the trends at the intersection of business and Data and AI. We want a customer-focused professional who is motivated to drive the creation of great enterprise products and who can collaborate and partner with other product teams and engineers. If you have a genuine passion for helping businesses achieve the full potential of their data, this role is for you.
- What we offer.
- We offer a competitive compensation package where you'll be rewarded based on your performance and recognized for the value you bring to our business. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
- If you can demonstrate that you meet the criteria above, please contact us as soon as possible.
- The exceptional EY experience. It's yours to build.
- EY | Building a better working world.
- EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
- Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
- Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
- EY is an equal opportunity, affirmative action employer providing equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, national origin, protected veteran status, disability status, or any other legally protected basis, in accordance with applicable law.
Skills:
Sales, Salesforce, Java
Job type:
Full-time
Salary:
negotiable
- Work alongside the wider team, lead the overall technology solution, planning and estimation for complex projects.
- Drive innovation and continuous improvement in design and delivery practices of our solutions, including Salesforce best practice in the implementation of client solutions.
- Manage the delivery teams to deliver full Salesforce lifecycle implementations, with a focus on client success but awareness of other wider business and technology ...
- Act as a role model for the team by always demonstrating the highest standards in business, digital led transformation.
- Conduct quality reviews of our implementation to ensure they meet our high standards.
- Lead end-to-end pre-sales activities.
- Provide leadership and support for delivery teams and staff in local offices.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people, and the communities. Additionally, Managers across our Firm are expected to: Develop diverse, high-performing people and teams through new and meaningful development opportunities.
- Collaborate effectively to build productive relationships and networks.
- Understand and lead the execution of key objectives and priorities for internal as well as external stakeholders.
- Align your team to key objectives as well as set clear priorities and direction.
- Make informed decisions that positively impact the sustainable financial performance and enhance the quality of outcomes.
- Influence stakeholders, teams, and individuals positively - leading by example and providing equal opportunities for our people to grow, develop and succeed.
- Lead with integrity and make a strong positive impact by energising others, valuing individual differences, recognising contributions, and inspiring self-belief.
- Deliver superior value and high-quality results to stakeholders while driving high performance from people across Deloitte.
- Apply their understanding of disruptive trends and competitor activity to recommend changes, in line with leading practices.
- Requirements: 8+ years of CRM experience, with a minimum of 4 years on the Salesforce core platform and Salesforce Marketing Cloud.
- At least 4 full life-cycle Salesforce implementations, with strong expertise as well as certifications in the following modules: Sales Cloud, Service Cloud, Marketing Cloud, Community Cloud, Force.com, Apttus.
- Development and troubleshooting experience with Salesforce (Apex, Visualforce, Lightning, Java/C#/OOP, JavaScript/jQuery, Angular, JS/Bootstrap, SQL/SOQL, Web Services) will be preferred.
- Lead technical design sessions with the client's technical team/architects; architect and document technical solutions aligned with client business objectives; identify gaps between the client's current and desired end states.
- Strong understanding of Agile / Iterative delivery methodology.
- Knowledge of data integration tools and experience integrating Salesforce with different business systems (ETL, CPQ, marketing automation, reporting, etc.).
- Understanding of systems architecture and ability to design scalable performance-driven solutions.
- Familiarity with platform authentication patterns (SAML, SSO, OAuth).
- Strong understanding of environment management, release management, code versioning best practices, and deployment methodologies.
- Responsible for project deliverables and capacity planning, managing the development team.
- Ensure utilization of staff is optimized by tracking individual team member forecast.
- Allocating resources and responsibilities across the team to deliver business results and develop team members.
- Responsible for supporting quality programs throughout the entire SDLC period.
- Experience with Wave Analytics, Lightning, BlueKai, Eloqua, ExactTarget, or Marketo will be a bonus.
- An appreciation of the consulting lifestyle and the ability to travel (both locally and abroad) are prerequisites, to fit our short-term and long-term project assignments.
- Due to volume of applications, we regret that only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request for money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website.
- Requisition ID: 107007. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
Experience:
6 years required
Skills:
Finance, Data Analysis, Excel
Job type:
Full-time
Salary:
negotiable
- Work closely with a diverse set of stakeholders in Finance and other parts of the organization.
- Consult stakeholders to propose the best suited data solution.
- Build end-to-end solutions including data workflows, dashboards, reports etc.
- Become familiar with financial data ecosystem at Agoda.
- Help the Finance Analytics team drive value for the Finance department and Agoda as a whole.
- Undergraduate degree.
- 6+ years of leadership experience in analytics/data science/insights/strategy.
- 3+ years' experience leading analytics, operational, product or other technical teams.
- Expert command of data analysis and data visualization tools and software such as Excel, SQL, Tableau, Python, R, or similar.
- Experience with ETL tools and data modelling, and proficient knowledge of SQL and relational databases: the ability to write, execute, and interpret queries is essential.
- Quick learner, problem-solving aptitude, effective prioritization, proactive and strong attention to detail.
- High sense of ownership and growth mindset, ability to be self-directed.
- Ability to understand business questions/requests and be able to suggest proper BI solutions which are measurable and scalable.
- Excellent communication skills and ability to influence peers and build strong relationships within Finance and cross-functionally.
- Experience in articulating strategic issues and negotiating with C-level executives - experience in leading strategy consulting projects a plus.
- People management - track record of developing stars.
- Ability and willingness to drive projects independently, working efficiently to deliver results rapidly and engaging the relevant stakeholders throughout the process.
- Accounting/Financial knowledge and commercial acumen.
- Master's degree in statistics, economics, mathematics or similar discipline.
- Solid technical/functional knowledge in statistics.
- Familiarity with scrum/agile methodology.
- Other helpful skills - T-SQL, batch scripting, ODBC, data mining, Hadoop.
- Experience with Wallet and Financial Payment systems, including integration and data analysis.
- Understanding of digital payment processes and financial transaction data.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Background in programming, databases and/or big data technologies, OR
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium, complex data sets that meet functional /non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In depth knowledge of data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: the Hadoop ecosystem, cloud technologies (e.g., AWS, Azure, Google), in-memory database systems (e.g., HANA, Hazelcast), traditional RDBMSs (e.g., Teradata, SQL Server, Oracle), and NoSQL databases (e.g., Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools, from traditional ETL tools (e.g., Informatica, Ab Initio, Alteryx) to more recent big data tools.
Skills:
Business Development, Creative Thinking, Project Management, English
Job type:
Full-time
Salary:
negotiable
- Strong analytical and problem-solving skills to identify commercial opportunity.
- Working well in a cross-disciplinary team with different types of stakeholders (IT, Agency, Business, Management).
- Business Development of Data Intelligence for corporate strategy.
- Analyze internal and external data in various aspects to identify threats and opportunities, and provide information/reports for management or the related business unit team to plan activities and strategies.
- Participate in the initiative's development plan of business unit / brand plans and align with corporate strategy, objectives and KPIs.
- Coordinate and consolidate with the related departments to implement projects or track project progress, and provide corrective supervision if necessary.
- Create and deliver insights report on new ideas to the management team or business units and seek appropriate decisions, directions, and approvals.
- Bachelor's or Master's degree in business or a related field of study.
- Minimum of 5-8 years in a Performance Management function or Commercial Analyst role.
- Experience in corporate/channel/brand/segment strategy.
- Experience working in Data Analytic related projects.
- Excellent analytical and problem-solving skills.
- Ability to apply logical and creative thinking to solve complex business problems.
- Ability to define the problems and frame answers in a logical and structured manner.
- Good project management, team leadership and sense of ownership.
- Good coordination skills, a positive attitude, and the ability to work under pressure.
- Strong communications, customer relationship and negotiation skills.
- Good command of both written and spoken English.
- TECHNICAL SKILLS: Basic understanding of the data ecosystem; advanced skills in dashboard and BI tools.
- Conceptual knowledge of data and analytics, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Data Analysis, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Data Collection and Cleaning: Gather data from various sources and prepare it for analysis by cleaning and checking for completeness or errors.
- Data Analysis: Utilize statistical techniques to study, analyze, discover insights, and derive meaningful data-driven conclusions.
- Reporting: Create reports in various formats, presenting data through graphs, features, or figures for stakeholders' use.
- Collaboration: Work with other teams to plan data analysis or resolve specific issues affecting organizational goals.
- Performance Monitoring: Define and track key performance indicators (KPIs) to measure the effectiveness and efficiency of various projects (a small KPI sketch follows this posting).
- Strategic Planning: Assist the management team in strategic planning by using data analysis to improve workflows and processes.
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Strong skills in tools and languages such as Power BI, R, Python, SQL, and C++.
- At least 5 years of experience with data modeling, database design, and data warehousing concepts.
- Proficiency in working with relational databases (e.g., MySQL, PostgreSQL).
- Knowledge of ETL tools and techniques for data integration and transformation is a plus.
- Location: BTS Ekkamai
- Working Day: Mon-Fri.
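The KPI sketch referenced above: a small, hedged example of computing simple weekly KPIs with pandas. The metric definitions and column names are illustrative assumptions, not the employer's actual KPIs.

    # Weekly KPIs from order records: revenue, average order value,
    # and the share of returning customers.
    import pandas as pd

    orders = pd.DataFrame({
        "week": ["W1", "W1", "W2", "W2"],
        "revenue": [100.0, 80.0, 120.0, 90.0],
        "is_returning": [True, False, True, True],
    })

    kpis = orders.groupby("week").agg(
        total_revenue=("revenue", "sum"),
        avg_order_value=("revenue", "mean"),
        returning_share=("is_returning", "mean"),
    )
    print(kpis)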
Experience:
2 years required
Skills:
Automation, Finance, SQL, DevOps, Python
Job type:
Full-time
Salary:
negotiable
- Automation of Analytics Pipelines - Develop and Maintain Automated Data Pipelines: Build and maintain robust data pipelines for reporting and analytics using cloud-native technologies such as AWS Glue, Redshift, and Lambda.
- Streamline Automation Frameworks: Ensure the high availability, performance, and cost-efficiency of data workflows by adhering to Zero Ops by Design principles, ensuring that data pipelines run seamlessly with minimal manual intervention.
- Timely, Accurate Reporting: Automate reporting processes to ensure consistent, accur ...
- Advanced Reporting & Analytics - Reporting Systems Design & Optimization: Design, implement, and optimize reporting systems that deliver actionable insights to key business stakeholders.
- BI Tools & Dashboards: Use visualization tools such as Tableau, Grafana, and AWS Quick Sight to create dynamic, self-service dashboards and reports, empowering teams to make data-driven decisions.
- Data Modeling and Schema Management - Develop Robust Data Models & Schemas: Design and maintain data models and schemas that support analytics, reporting, and operational needs.
- Single Version of Truth: Ensure consistency, accuracy, and reliability by establishing a "single version of truth," providing a consistent data framework across the organization.
- Quality-as-a-Service Development - Build Scalable Quality Solutions: Design and maintain "Data/AI Quality-as-a-Service" solutions to monitor data drift, analyze performance metrics, and detect data issues early in the process (a drift-check sketch follows the skills lists below).
- Zero-Ops Design for Quality Monitoring: Ensure the high availability and performance of quality solutions while aligning with zero-ops design principles, minimizing operational overhead.
- Cross-Functional Collaboration - Collaborate with Teams: Work closely with data scientists, analysts, and application developers to integrate data solutions seamlessly into their workflows, enabling advanced analytics and enhancing decision-making capabilities.
- Compliance & Security - Ensure Data Security & Compliance: Uphold data security and privacy standards while ensuring all solutions comply with banking regulations and industry governance requirements.
- Governance Standards: Maintain rigorous governance practices for data access, privacy, and security across all automated reporting systems.
- Continuous Improvement - Technology Advocacy: Stay informed about emerging trends in cloud data engineering, automation, and analytics. Advocate for the adoption of new technologies that can enhance system capabilities and maintain a competitive edge.
- Drive Continuous Improvement: Continuously refine processes and solutions to ensure they remain optimized for both performance and cost.
- Essential Skills & Experience: Bachelor's degree in Computer Science, Engineering, Business Information Systems, or a related field.
- 2+ years of experience in data engineering, automation, or analytics engineering, focusing on reporting and business intelligence in the financial or banking sector.
- Expertise in cloud platforms (AWS preferred) and technologies such as AWS Glue, Redshift, Lambda, and S3.
- Experience with BI/Visualization tools (e.g., Tableau, Grafana, AWS QuickSight).
- Strong understanding of data modeling principles, ETL/ELT processes, and creating data schemas for reporting and analytics.
- Proficiency in SQL, Python, or other relevant programming languages.
- Familiarity with "Zero Ops by Design" principles and automation frameworks..
- Preferred SkillsKnowledge of financial regulations and their impact on data governance and reporting in the banking sector.
- Experience in building and maintaining "Data/AI Quality-as-a-Service" solutions for monitoring and ensuring data quality.
- Familiarity with DevOps practices and CI/CD pipelines for analytic engineering solutions.
- Experience in setting up and maintaining high-performing, scalable reporting systems.
- Understanding of advanced analytics and machine learning concepts.
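The drift-check sketch referenced above: a hedged example of one common data-drift signal, the Population Stability Index (PSI), computed with NumPy. The bin count, the synthetic data, and the 0.2 rule-of-thumb threshold are assumptions for illustration, not the bank's method.

    # PSI between a reference (baseline) sample and a current sample;
    # larger values indicate a bigger distribution shift.
    import numpy as np

    def psi(reference, current, bins=10):
        # Bin edges come from the reference distribution.
        edges = np.histogram_bin_edges(reference, bins=bins)
        ref_pct = np.histogram(reference, bins=edges)[0] / len(reference)
        cur_pct = np.histogram(current, bins=edges)[0] / len(current)
        # Floor the proportions so log() and division stay defined.
        ref_pct = np.clip(ref_pct, 1e-6, None)
        cur_pct = np.clip(cur_pct, 1e-6, None)
        return float(np.sum((cur_pct - ref_pct) * np.log(cur_pct / ref_pct)))

    rng = np.random.default_rng(0)
    baseline = rng.normal(0.0, 1.0, 10_000)   # reference sample
    today = rng.normal(0.3, 1.0, 10_000)      # shifted mean simulates drift
    print(f"PSI = {psi(baseline, today):.3f}")  # > 0.2 is a common alarm level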
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention and no need whatsoever to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Accordingly, please do not upload any documents, including copies of your national ID card, and do not enter sensitive personal data or any other data that is not relevant or necessary for the purpose of this job application on the website. Please also make sure you have removed any sensitive personal data (if any) from your resume and other documents before uploading them to the website. The Bank does need to collect personal data concerning your criminal record in order to consider your application for employment, or to verify your qualifications, disqualifying attributes, or suitability for the position; consent to the collection, use, or disclosure of such data is necessary for entering into a contract and for being considered for the purposes above. If you do not give this consent, or later withdraw it, the Bank may be unable to proceed for those purposes, and you may lose the opportunity to be considered for employment with the Bank.