Skills:
SQL, Data Warehousing, ETL, English
Job Type:
Full-time
Salary:
Negotiable
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Retrieve, prepare, and process a rich variety of data sources.
- Apply data quality, cleaning and semantic inference techniques to maintain high data quality.
- Explore data sources to better understand the availability and quality/integrity of data.
- Gain fluency in AI/ML techniques.
- Experience with relational database systems with expertise in SQL.
- Experience in data management, data warehousing or unstructured data environments.
- Experience with data integration or ETL management tools such as Talend, Apache Airflow, AWS Glue, Google Dataflow, or similar.
- Experience programming in Python, Java, or an equivalent language is a plus.
- Experience with Business Intelligence tools and platforms is a plus, e.g., Tableau, QlikView, Google Data Studio, Google Analytics, or similar.
- Experience with Agile methodology and Extreme Programming is a plus.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Good communication in English.
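As a rough illustration of the data-preparation and data-quality duties listed above, here is a minimal pandas sketch; the column names and cleaning rules are invented for the example, not taken from the posting.

```python
import pandas as pd

# Hypothetical raw extract: duplicate rows, missing values, inconsistent casing.
raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "email": ["A@X.COM", "a@x.com", None, "c@y.com"],
    "amount": [100.0, 100.0, None, 250.0],
})

# Normalize casing, then drop exact duplicates and rows missing required fields.
clean = raw.assign(email=raw["email"].str.lower())
clean = clean.drop_duplicates().dropna(subset=["email", "amount"])
# clean keeps two rows: customer_id 1 and 3
```

In a real pipeline the same normalize/deduplicate/drop steps would run inside the ETL tool (Airflow task, Glue job, etc.) rather than an ad-hoc script.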
Skills:
ETL, Power BI, Tableau
Job Type:
Full-time
Salary:
Negotiable
- Bachelor's degree in Computer Science or a related field.
- At least 3-5 years of experience in IT services.
- At least 2 years of experience developing Business Intelligence systems.
- Knowledge and understanding of data structures, databases, and data warehouses (e.g., structured data, unstructured data, data management).
- Skills in ETL techniques for developing data ingestion, data blending, and data integration.
- Knowledge of BI reporting and dashboard tools such as Power BI, Tableau, and Business Objects.
- Human Resources Office, Chang International Co., Ltd. (Head Office), ThaiBev Quarter Building, 8th-9th Floor, Ratchadaphisek Road, Khlong Toei, Bangkok 10110.
Skills:
ETL, Python
Job Type:
Full-time
Salary:
Negotiable
- Combine raw data from various internal and external sources, and build algorithms and prototypes.
- Collect structured and unstructured data, and assess business needs and objectives.
- Organize and clean raw data (data cleansing, shaping, and ETL), and prepare data for modeling.
- Explore data with data visualization tools.
- Find ways to improve data quality and build trust in the data; monitor trends and patterns.
- Apply AI/ML for predictive analytics and report the results.
- Bachelor's degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.
- Experience in data visualization.
- Experience with cloud platforms.
- Experience with Python libraries such as Pandas, NumPy, Scikit-learn, Matplotlib, and PyTorch.
- Experience with machine learning techniques such as regression, classification, clustering, and association.
- Understanding of statistics.
- Presentation skills, with the ability to turn complex ideas into easy-to-understand formats.
- A good communicator, able to manage stakeholders.
- Problem-solving skills.
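The regression technique named in the qualifications above can be illustrated with a minimal NumPy sketch: an ordinary least-squares fit on toy data (the numbers are invented for the example).

```python
import numpy as np

# Toy example: recover y = 2x + 1 with ordinary least squares.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

# Design matrix [x, 1]; lstsq solves min ||A @ coeffs - y||^2.
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
# slope ≈ 2.0, intercept ≈ 1.0
```

Libraries such as Scikit-learn wrap this same fit (plus classification, clustering, etc.) behind an estimator API; the normal-equations version here just makes the math visible.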
Skills:
ETL, Data Analysis, Industry trends
Job Type:
Full-time
Salary:
฿70,000 - ฿90,000, negotiable
- Analyze and interpret complex data sets to uncover insights and trends that drive business strategy and decision-making.
- Collaborate with cross-functional teams to understand their data needs and provide actionable recommendations.
- Design and maintain dashboards, reports, and visualizations using tools to communicate insights effectively.
- Extract data from various sources, including databases, APIs, and third-party services, ensuring data quality and accuracy.
- Develop and implement data models, ETL processes, and automated reporting solutions to streamline data analysis.
- Stay updated with industry trends and new technologies to enhance the company's data analytics capabilities.
- Participate in data governance initiatives, ensuring compliance with data privacy and security regulations.
- Requirements/Qualifications (must have):
- Bachelor's degree in Statistics, Data Science, or a related field; an MBA or advanced degree is a plus.
- Minimum of 5 years of experience in business intelligence or data analysis, preferably in a fast-paced e-commerce environment.
- Proficient in SQL and at least one data visualization tool (e.g., Tableau, Power BI), with a solid understanding of data warehousing concepts.
- Strong analytical skills, with the ability to manipulate, clean, and derive insights from large datasets.
- Effective communicator with excellent presentation skills, capable of translating complex data into simple, actionable insights for non-technical stakeholders.
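The extract-and-aggregate workflow described above can be sketched with SQL; here an in-memory SQLite database stands in for a production source, and the table and columns are illustrative.

```python
import sqlite3

# In-memory stand-in for a production database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("North", 100.0), ("North", 50.0), ("South", 75.0)])

# Aggregate into a dashboard-ready summary: revenue per region.
rows = con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
# rows == [("North", 150.0), ("South", 75.0)]
```

The same query shape (GROUP BY over an extracted fact table) is what a Tableau or Power BI dashboard would issue against the warehouse.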
Skills:
ETL, SQL, Hadoop
Job Type:
Full-time
Salary:
Negotiable
- Conduct meetings with users to understand data requirements, and design databases based on that understanding and those requirements with performance in mind.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Big Data, ETL, SQL
Job Type:
Full-time
Salary:
Negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and design databases based on that understanding and those requirements with performance in mind.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
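The first responsibility in the posting above — transforming raw data into formats suitable for LLM training — can be sketched in plain Python. The normalization and deduplication rules here are a simplified illustration, not the company's actual pipeline.

```python
# Minimal text-preparation step for a pre-training corpus: normalize
# whitespace and drop empties and exact duplicates, preserving order.
def prepare_corpus(docs):
    seen, out = set(), []
    for doc in docs:
        text = " ".join(doc.split())      # collapse runs of whitespace
        if text and text not in seen:     # skip empty and duplicate texts
            seen.add(text)
            out.append(text)
    return out

corpus = prepare_corpus(["hello  world", "hello world", "", "data  pipelines"])
# corpus == ["hello world", "data pipelines"]
```

At scale the same logic would run as a Spark/Databricks job (e.g., a distinct over normalized text) rather than a single-process loop.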
Experience:
5+ years
Skills:
Python, ETL, Compliance
Job Type:
Full-time
Salary:
Negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
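The ETL/ELT jobs described above follow a common extract-transform-load shape. A minimal, library-free sketch (all data, field names, and stages are illustrative; in production each stage would talk to cloud storage or a warehouse):

```python
# Skeleton of an extract-transform-load job.
def extract():
    # Stand-in for reading from a source system or landing zone.
    return [{"id": 1, "price": "10.5"}, {"id": 2, "price": "3.0"}]

def transform(records):
    # Cast types and derive fields; in a real job, bad records would be
    # routed to a dead-letter sink instead of raising.
    return [{"id": r["id"], "price": float(r["price"])} for r in records]

def load(records, sink):
    # Stand-in for writing to a data lake or warehouse table.
    sink.extend(records)

warehouse = []
load(transform(extract()), warehouse)
```

Frameworks like PySpark, Beam, or Flink keep this same three-stage structure but distribute each stage across workers.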
Experience:
3+ years
Skills:
Quality Assurance, Assurance, ETL, English
Job Type:
Full-time
Salary:
Negotiable
- Act as a strategic partner, providing analytics technology expertise and input to business functions, resulting in an optimal tech investment portfolio and effective digital solution recommendations from DG's expertise to meet business requirements.
- Take the lead among stakeholders throughout the organization to understand data needs, identify issues or opportunities for leveraging company data, and propose solutions that support decision-making and drive business outcomes.
- Take the lead in communicating with related parties (in analytics-related areas), i.e., ...
- Take the lead in the design and layout of information visualizations for dashboards, presentations, and infographics that present complex datasets in a simple, intuitive format, adhering to data visualization best practices.
- Develop, test, debug, and integrate data visualization solutions, and ensure visualization (including interaction) consistency across analytics projects.
- Lead the technical quality assurance process to ensure the dashboard design and data modeling support all business requirements.
- Focus on the surrounding digital environment (i.e., other applications) and ensure that newly implemented BI/analytics works seamlessly through proper interfaces to optimize work effectiveness. During implementation, ensure that the necessary data are integrated, transformed, and loaded, and that the system is brought into production successfully.
- Take the lead in delivering and cultivating business value by driving a data-driven company through maximum adoption of analytics, aligned with the digital transformation master plan.
- Control, manage, and govern Level 2 support; identify and fix configuration-related problems regarding BI and visualization.
- Act as project manager for data projects, managing project scope, timeline, and budget.
- Manage relationships with stakeholders, coordinate work between different parties, and provide regular updates.
- EXPERIENCE.
- At least 3-4 years of experience in the BI and analytics area, designing and building BI dashboards.
- At least 3-4 years of experience fully implementing IT projects (design, development, and support).
- Experience in the oil and gas business would be a strong asset.
- Knowledge of data analytics and computation tools, e.g., ETL/ELT tools and/or data visualization tools, including cloud platform solutions.
- Knowledge of enterprise software and technology such as SAP ECC, SAP BW, Power BI, AWS, etc.
- Able to construct complex data models to help visualize and interpret data.
- A continuous learner and challenged by new BI technologies.
- EDUCATION.
- Bachelor's degree in Computer Science, Computer Engineering, Information Technology, or a related discipline.
- A certificate related to the data analytics area would be a strong asset.
- OTHER REQUIREMENTS.
- A self-starter attitude and eagerness to further develop new skills.
- Strong written and verbal English skills.
Skills:
ETL, Compliance, SQL
Job Type:
Full-time
Salary:
Negotiable
- Design, implement, and manage end-to-end data pipeline architectures.
- Configure and maintain data ingestion workflows (ETL) across several production systems.
- Transform data into data marts and data models that can be easily analyzed.
- Ensure the data is correct and highly usable, delivered on time and with good performance.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Ensure compliance with data governance and security policies.
- Bachelor's Degree or higher in Computer Science, Information Technology, Computer Engineering, or a related field.
- Minimum 3 years of work experience as a Data Engineer.
- Strong SQL command and knowledge of NoSQL tools and languages.
- Experience with ETL tools and data models such as SSIS, SSAS.
- Experience working on big data platforms is an advantage.
- Experience with cloud platforms such as AWS, GCP.
- Ability to develop in Python or R is an advantage.
- Good business understanding, able to identify business problems, formulate business goals, and find relevant data.
- Required experience.
- 3 years.
- Job level.
- Supervisor.
- Salary.
- Negotiable.
- Job function.
- IT / Programming.
- Engineering.
- Job type.
- Full-time.
- About the company — number of employees: 500-1,000.
- Company type: consumer goods industry.
- Company location: Bangkok.
- Website: www.osotspa.com.
- Founded: 1891.
- Rating: 4.5/5.
- We aim to be Thailand's leading consumer-goods company, in tune with consumer lifestyles and widely recognized across ASEAN. Through more than a century of proud success, Osotspa has remained committed to researching and developing products that meet the needs of Thai people and improve their quality of life, continuously building its capabilities to earn the trust of Thai consumers. Today, Osotspa proudly operates a fully integrated business: from consumer research to innovation and product development, manufacturing, marketing, distribution, and continuous marketing and sales promotion activities, ensuring that Osotspa's products are truly part of consumers' daily lives and lifestyles.
- Join us: Now more than 123 years old, Osotspa continues to develop quality products without pause, guided by its philosophy: "Put the interests of others before your own; value goodwill above money; be honest in your work; uphold business ethics."
- Office district: Bang Kapi.
- Head office: 348 Ramkhamhaeng Road.
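The data-mart responsibility in the posting above — transforming data into marts and models that can be easily analyzed — can be sketched as a toy star schema in SQLite; all table and column names are invented for the example.

```python
import sqlite3

# Toy star schema: one dimension (dim_product) and one fact (fact_sales).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'Drink'), (2, 'Snack');
INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 4);
""")

# A typical data-mart query: units sold per category.
mart = con.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
# mart == [('Drink', 5), ('Snack', 4)]
```

Tools like SSIS/SSAS mentioned in the posting automate the loading and cube-building around exactly this fact/dimension split.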
Experience:
5+ years
Skills:
ETL, Quantitative Analysis, Industry trends
Job Type:
Full-time
Salary:
Negotiable
- Translating business requirements to technical solutions leveraging strong business acumen.
- You will be a core member of the EY Microsoft Data and AI team, responsible for extracting large quantities of data from clients' IT systems, developing efficient ETL and data management processes, and building architectures for rapid ingestion and dissemination of key data.
- Apply expertise in quantitative analysis, data mining and presentation of data to de ...
- Extremely flexible, with experience managing multiple tasks and priorities on deadline.
- Applying technical knowledge to architect solutions that meet business and IT needs, create Data Platform roadmaps, and enable the Data Platforms to scale to support additional use cases.
- Staying abreast of current business and industry trends relevant to the client's business.
- Monitoring progress, managing risk, and ensuring key stakeholders are kept informed about progress and expected outcomes.
- Understanding customers overall data estate, IT and business priorities and success measures to design implementation architectures and solutions.
- Strong team collaboration and experience working with remote teams.
- Working on large-scale client engagements. Fostering relationships with client personnel at appropriate levels. Consistently delivering quality client services. Driving high-quality work products within expected timeframes and on budget.
- Demonstrated significant professional experience of commercial, strategy and/or research/analytics interacting with senior stakeholders to effectively communicate insights.
- Execute on building data solutions for business intelligence and assist in effectively managing and monitoring the data ecosystem of analytics, data lakes, warehouses platforms and tools.
- Provide directional guidance and recommendations around data flows including data technology, data integrations, data models, and data storage formats.
- To qualify for the role, you must have.
- Bachelor's degree or MS degree in Business, Economics, Technology Entrepreneurship, Computer Science, Informatics, Statistics, Applied Mathematics, Data Science, or Machine Learning.
- Minimum of 3-5 years of relevant consulting experience with a focus on advanced analytics and business intelligence or similar roles. New graduates are welcome!
- Communication and critical thinking are essential, must be able to listen and understand the question and develop and deliver clear insights.
- Experience communicating the results of analysis to both technical and non-technical audiences.
- Independent and able to manage and prioritize workload.
- Ability to adapt quickly and positively to change.
- Breadth of technical passion, desire to learn and knowledge services.
- Willingness and ability to travel to meet client if need.
- Ideally, you ll also have.
- Experience working business or IT transformation projects that have supported data science, business intelligence, artificial intelligence, and cloud applications at scale.
- Ability to communicate clearly and succinctly, adjusting to a variety of styles and audiences, with the ability to tell compelling stories with data.
- Experience with C#, VBA, JavaScript, R.
- A vast understanding of key BI trends and the BI vendor landscape.
- Working experience with Agile and/or Scrum methods of delivery.
- Working experience with design led thinking.
- Microsoft Certifications in the Data & AI domain.
- We're interested in passionate leaders with a strong vision and a desire to deeply understand the trends at the intersection of business and Data and AI. We want a customer-focused professional who is motivated to drive the creation of great enterprise products and who can collaborate and partner with other product teams and engineers. If you have a genuine passion for helping businesses achieve the full potential of their data, this role is for you.
- What we offer.
- We offer a competitive compensation package where you'll be rewarded based on your performance and recognized for the value you bring to our business. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
- If you can demonstrate that you meet the criteria above, please contact us as soon as possible.
- The exceptional EY experience. It's yours to build.
- EY | Building a better working world.
- EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
- Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
- Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
- EY is an equal opportunity, affirmative action employer providing equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, national origin, protected veteran status, disability status, or any other legally protected basis, in accordance with applicable law.
Skills:
Data Analysis, SQL, Problem Solving, English
Job Type:
Full-time
Salary:
Negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and ensure data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in data processing: batch / real-time / near-real-time.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.
Skills:
ETL, Python, Java
Job Type:
Full-time
Salary:
Negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
- Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
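The data-quality responsibility above (ensuring quality, consistency, and reliability across data systems) can be sketched as a simple schema gate; the schema and records are illustrative, not from the posting.

```python
# Minimal data-quality gate: validate each record against an expected schema
# before it enters the warehouse.
SCHEMA = {"id": int, "amount": float}

def validate(record):
    # A record passes only if it has exactly the expected fields,
    # each with the expected type.
    return (set(record) == set(SCHEMA)
            and all(isinstance(record[k], t) for k, t in SCHEMA.items()))

good = validate({"id": 1, "amount": 9.99})    # True
bad = validate({"id": "1", "amount": 9.99})   # False: id has the wrong type
```

Production pipelines express the same idea declaratively (table constraints, expectation suites, contract tests) rather than hand-written checks.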
Skills:
Automation, Product Owner, Python
Job Type:
Full-time
Salary:
Negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with the Lead Architect, Data Product Manager (Product Owner), and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development and programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns, and ETL/ELT best practices.
- Define data retention policies, monitor performance, and advise on any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Informatica IICS, Databricks/Spark, and Python hands-on development skills are a must.
- Clear hands-on experience with database systems (SQL Server, Oracle, Azure Synapse, Snowflake, and Cosmos), cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong Knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge and working experience with Duck Creek is an added plus.
- Knowledge and working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
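One common ETL methodology referenced in the requirements above, incremental loading, can be sketched as a watermark filter: each run picks up only rows newer than the last processed timestamp. The field names and data here are illustrative.

```python
# Watermark-based incremental load: return the new batch plus the
# advanced watermark for the next run.
def incremental_load(source_rows, watermark):
    batch = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in batch), default=watermark)
    return batch, new_watermark

rows = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
batch, wm = incremental_load(rows, watermark=10)
# batch == [{"id": 2, "updated_at": 20}]; wm == 20
```

Tools like Informatica IICS or Databricks Auto Loader implement the same pattern with persisted checkpoints instead of an in-memory watermark.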
- The candidate will be responsible for design, and implement new solutions for complex data ingestions from multiple sources to enterprise data products with focus on automation, performance, resilience and scalability, etc.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, snowflake, Cosmos and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising on any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration and data migration. Hands-on development skills in Informatica IICS, Databricks/Spark and Python are a must.
- Clear hands-on experience with database systems - SQL server, Oracle, Azure Synapse, Snowflake and Cosmos, Cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In-depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
Skills:
Sales, Salesforce, Java
Job Type:
Full-time
Salary:
Negotiable
- Work alongside the wider team, lead the overall technology solution, planning and estimation for complex projects.
- Drive innovation and continuous improvement in design and delivery practices of our solutions, including Salesforce best practice in the implementation of client solutions.
- Manage the delivery teams to deliver full Salesforce lifecycle implementations, with a focus on client success but awareness of other wider business and technology ...
- Act as a role model for the team by always demonstrating the highest standards in business, digital led transformation.
- Conduct quality reviews of our implementation to ensure they meet our high standards.
- Lead end-to-end pre-sales activities.
- Provide leadership and support for delivery teams and staff in local offices.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Managers across our Firm are expected to:
- Develop diverse, high-performing people and teams through new and meaningful development opportunities.
- Collaborate effectively to build productive relationships and networks.
- Understand and lead the execution of key objectives and priorities for internal as well as external stakeholders.
- Align your team to key objectives as well as set clear priorities and direction.
- Make informed decisions that positively impact the sustainable financial performance and enhance the quality of outcomes.
- Influence stakeholders, teams, and individuals positively - leading by example and providing equal opportunities for our people to grow, develop and succeed.
- Lead with integrity and make a strong positive impact by energising others, valuing individual differences, recognising contributions, and inspiring self-belief.
- Deliver superior value and high-quality results to stakeholders while driving high performance from people across Deloitte.
- Apply an understanding of disruptive trends and competitor activity to recommend changes, in line with leading practices.
- Requirements: 8+ years of CRM experience with a minimum of 4 years on the Salesforce core platform and Salesforce Marketing Cloud.
- At least 4 full life-cycle Salesforce implementations, with strong expertise as well as certifications in the following modules: Sales Cloud, Service Cloud, Marketing Cloud, Community Cloud, Force.com, Apttus.
- Development and troubleshooting experience with Salesforce (Apex, Visualforce, Lightning, Java/C#/OOP, JavaScript/jQuery, Angular, JS/Bootstrap, SQL/SOQL, Web Services) will be preferred.
- Lead technical design sessions with the client's technical team/architects; architect and document technical solutions aligned with client business objectives; identify gaps between the client's current and desired end states.
- Strong understanding of Agile / Iterative delivery methodology.
- Knowledge of data integration tools and experience integrating Salesforce with different business systems (ETL, CPQ, marketing automation, reporting, etc.).
- Understanding of systems architecture and ability to design scalable performance-driven solutions.
- Familiarity with platform authentication patterns (SAML, SSO, OAuth).
- Strong understanding of environment management, release management, code versioning best practices, and deployment methodologies.
- Responsible for project deliverables, capacity planning and managing the development team.
- Ensure utilization of staff is optimized by tracking individual team member forecast.
- Allocating resources and responsibilities across the team to deliver business results and develop team members.
- Responsible for supporting quality programs throughout the entire SDLC period.
- Experience with Wave Analytics, Lightning, BlueKai, Eloqua, ExactTarget or Marketo will be a bonus.
- An appreciation of the consulting lifestyle and the ability to travel (both locally and abroad) is a prerequisite, to fit our short-term and long-term project assignments.
- Due to volume of applications, we regret that only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request for money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website.
- #LI-MH Requisition ID: 104113. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
Skills:
Big Data, Java, Python
Job Type:
Full-time
Salary:
Negotiable
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium and complex data sets that meet functional and non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure the technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role-based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising on any necessary infrastructure changes based on functional and non-functional requirements.
- In-depth knowledge of the data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems - Hadoop ecosystem, cloud technologies (e.g. AWS, Azure, Google), in-memory database systems (e.g. HANA, Hazelcast, etc.) and other database systems - traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools - traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
Skills:
ETL, Java, Python
Job Type:
Full-time
Salary:
Negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation and deployment of emerging tools and process for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Requirements: Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Great numerical and analytical skills.
- Worked with BI tools such as Tableau, Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Business Development, Creative Thinking, Project Management, English
Job Type:
Full-time
Salary:
Negotiable
- Strong analytical and problem-solving skills to identify commercial opportunity.
- Working well in a cross-disciplinary team with different types of stakeholders (IT, Agency, Business, Management).
- Business Development of Data Intelligence for corporate strategy.
- Analyze internal and external data in various aspects to identify threats & opportunities and provide information/report for management or related business unit team to plan activities and strategies.
- Participate in the initiative's development plan of business unit / brand plans and align with corporate strategy, objectives and KPIs.
- Coordinate and consolidate with the related department to implement a project or tracking the project progress and provide corrective supervision if necessary.
- Create and deliver insights report on new ideas to the management team or business units and seek appropriate decisions, directions, and approvals.
- Bachelor's or Master's degree in business or a related field of study.
- Minimum 5-8 years in a Performance Management function or Commercial Analyst role.
- Experience in corporate/channel/brand/segment strategy.
- Experience working in Data Analytic related projects.
- Excellent analytical and problem-solving skills.
- Ability to apply logical and creative thinking to solve complex business problems.
- Ability to define the problems and frame answers in a logical and structured manner.
- Good project management, team leadership and sense of ownership.
- Good coordination skills with a positive attitude and the ability to work under pressure.
- Strong communications, customer relationship and negotiation skills.
- Good command of both written and spoken English.
- TECHNICAL SKILLS: Basic understanding of data ecosystem, Advanced skills in dashboard and BI tools.
- Conceptual knowledge of data and analytics, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Automation, SQL, Data Warehousing
Job Type:
Full-time
Salary:
Negotiable
- Act as the first point of contact for users facing issues related to data and reporting.
- Manage, track, and resolve incidents, service requests, and inquiries via the ticketing system.
- Classify and prioritize incoming tickets based on severity, impact, and urgency.
- Respond to and resolve user tickets in a timely and efficient manner.
- Escalate unresolved or complex issues to appropriate internal teams while maintaining clear communication with the users.
- Diagnose, troubleshoot, and resolve data-related issues, including reporting errors, data discrepancies, and system malfunctions.
- Collaborate with other teams (data engineers, data scientists, data analysts, and other IT teams) to address complex issues.
- Provide clear and comprehensive updates to users on incident status and resolution timelines.
- Provide technical support to end-users via phone, email, chat, and ticketing system.
- Process user requests for new reports, data extracts, or updates to existing data views.
- Coordinate with relevant stakeholders to ensure requests are completed accurately and efficiently.
- Respond to user inquiries about reporting tools, data access, and system functionalities.
- Provide guidance and training to users on self-service reporting tools and best practices.
- Maintain an updated knowledge base for frequently asked questions and user guidance.
- Contribute to the development and maintenance of knowledge base articles.
- Analyze recurring issues and recommend changes to improve system stability and user experience.
- Collaborate with development and data teams to identify opportunities for automation and improved processes.
- Collaborate with other teams to improve system performance and user experience.
- Provide on-call support during evenings, weekends, or holidays as required.
- Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in SQL and database querying for troubleshooting and resolving data-related issues.
- Strong understanding of database management and concepts.
- Knowledge of data warehousing concepts and ETL processes.
- Experience with business intelligence and data visualization tools (e.g., Power BI, Oracle OBIEE, Oracle BIP).
- Familiarity with data visualization and reporting systems.
- Experience with cloud platforms (AWS, Azure, GCP, or Oracle Cloud).
- Strong analytical and problem-solving skills.
- Excellent communication skills, both written and verbal, with the ability to explain technical concepts to non-technical users.
- Ability to manage multiple tasks, prioritize effectively, and work under pressure.
- Strong customer service orientation and detail-oriented with a focus on delivering high-quality results.