Skills:
SQL, Oracle, Data Warehousing
Job type:
Full-time
Salary:
Negotiable
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- At least 7 years of experience as a Data Engineer or in a related role.
- Hands-on experience with SQL, database management (e.g., Oracle, SQL Server, PostgreSQL), and data warehousing concepts.
- Experience with ETL/ELT tools such as Talend, Apache NiFi, or similar.
- Proficiency in programming languages like Python, Java, or Scala for data manipulation and automation.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Strong understanding of data governance, security, and privacy frameworks in a financial services context.
- Excellent problem-solving skills and attention to detail.
- Experience working with Data Visualization or BI tools like Power BI, Tableau.
- Familiarity with machine learning concepts, model deployment, and AI applications.
- Banking or financial services industry experience, especially in retail or wholesale banking data solutions.
- Certification in cloud platforms (e.g., AWS Certified Data Engineer, Microsoft Azure Data Engineer, Google Professional Data Engineer).
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention of, and no need for, processing sensitive personal data, including data relating to religion and/or blood group that may appear on a copy of your national ID card. Therefore, please do not upload any documents, including copies of your national ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the purpose of your job application on the website. In addition, please make sure you have removed any sensitive personal data (if any) from your resume and other documents before uploading them to the website. The Bank does need to collect personal data concerning your criminal record in order to consider you for employment, or to verify qualifications, disqualifications, or suitability for the position; your consent to the collection, use, or disclosure of personal data concerning your criminal record is necessary for entering into a contract and for being considered for the purposes above. If you do not give such consent, or later withdraw it, the Bank may be unable to proceed for those purposes, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
ETL, DevOps, Automation
Job type:
Full-time
Salary:
Negotiable
- Data Pipeline Development: Design, implement, and optimize scalable ETL/ELT pipelines to ingest, transform, and store structured and unstructured data in a cloud environment (AWS is core, but not limited to it); a minimal orchestration sketch follows this listing.
- Machine Learning Pipeline Development: Work collaboratively with data scientists to productionize and maintain scalable machine learning services. The solutions encompass a variety of approaches, including traditional and near real-time machine learning, deployed across multi-state service architectures.
- Data Platform: Collaborate closely with DevOps and infrastructure teams to design, implement, and manage scalable data storage and processing platforms. Leverage AWS services such as S3, Redshift, Glue, Lambda, Athena, and EMR to ensure performance, reliability, and cost-efficiency.
- Data Modeling and Schema Management: Develop and maintain robust data models and schemas to support analytics, reporting, and operational requirements. Adhere to the design principle of establishing a "single version of truth" to ensure consistency, accuracy, and reliability across all data-driven processes.
- Data/AI Quality-as-a-Service Development: Design, develop, and maintain scalable "Data/AI Quality-as-a-Service" solutions, adhering to zero-ops design principles. The scope of quality includes monitoring data drift, analyzing performance metrics, and detecting model drift to ensure consistent, reliable, and high-performing AI systems.
- Cross-Functional Collaboration: Collaborate closely with data scientists, analysts, and application developers to ensure the seamless integration of data solutions into workflows, enhancing functionality and enabling data-driven decision-making.
- Automation & Monitoring: Design and implement robust monitoring and automation frameworks to ensure the high availability, performance, and cost-efficiency of data workflows, guided by the principle of "Zero Ops by Design".
- Compliance & Security: Uphold data security, privacy, and compliance with banking regulations and industry standards, ensuring all solutions meet rigorous governance requirements.
- Continuous Improvement: Stay informed about emerging technologies and trends in cloud data engineering, advocating for their adoption to enhance system capabilities and maintain a competitive edge.
- Educational Background: Bachelor's degree in Computer Science, Computer Engineering, Data Engineering, or a related field.
- Experience: 3+ years of experience in cloud data engineering or similar roles.
- Proven expertise in cloud data technologies.
- Hands-on experience with big data technologies such as Apache Spark.
- Technical Skills: Proficiency in SQL and programming languages such as Python, Java, or Scala.
- Expertise in data pipeline and workflow orchestration tools for both batch and real-time processing (e.g., Apache Airflow, AWS Step Functions).
- Understanding of data warehouse and lakehouse architectures.
- Familiarity with software development best practices, including SDLC concepts, CI/CD/(+CL) pipelines, and Infrastructure as Code tools (e.g., Terraform, AWS CloudFormation).
- Other Skills: Strong problem-solving and analytical thinking capabilities.
- Excellent communication and collaboration skills.
- Preferred Qualifications: AWS Data Analytics - Specialty certification or equivalent experience.
- Experience in banking or fintech environments. Understanding of financial data and regulatory requirements.
- Familiarity with real-time data processing and stream analytics.
- Experience in working end-to-end with data scientists and analysts as part of "AnalyticsOps" to develop and maintain ML/AI services is a strong advantage.
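A hedged illustration of the kind of batch ETL/ELT orchestration this listing describes: a minimal Apache Airflow 2.x DAG with an extract step followed by a load step. The DAG name, schedule, and placeholder extract/load functions are assumptions for illustration only, not taken from the posting.

```python
# Minimal Airflow 2.x DAG sketch for a daily ELT job (illustrative only;
# the dag_id, schedule, and task bodies are hypothetical placeholders).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the previous day's records from a source system.
    print("extracting partition", context["ds"])


def load(**context):
    # Placeholder: write the transformed partition to the warehouse.
    print("loading partition", context["ds"])


with DAG(
    dag_id="daily_customer_elt",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract, then load
```

The same pattern extends to AWS-native orchestration (e.g., Step Functions) mentioned in the requirements; in practice the task bodies would trigger Glue jobs or Spark submits rather than print statements.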
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
Negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated where possible) to support business data analysis; a brief preparation sketch follows this listing.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, and near real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.
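A minimal sketch of the automated data-preparation work referenced above (retrieve, clean, and hand off data for analysis), assuming a SQL source reachable via SQLAlchemy; the connection string, table, and column names are hypothetical.

```python
# Minimal data-preparation sketch (illustrative only; the connection string,
# table, and column names are hypothetical).
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@db-host/analytics")

# Pull the previous day's orders and tidy them up for analysts.
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, created_at "
    "FROM orders "
    "WHERE created_at >= CURRENT_DATE - INTERVAL '1 day'",
    engine,
)
orders["created_at"] = pd.to_datetime(orders["created_at"])
orders = orders.dropna(subset=["customer_id"]).drop_duplicates("order_id")

orders.to_parquet("orders_daily.parquet", index=False)  # hand-off for BI tools
```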
Experience:
3+ years
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
Negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience of data modelling techniques and integration patterns.
- Write data transformation jobs through code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge in data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solution.
- Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive, etc.); a short DuckDB example follows this listing.
- Experience working with data warehouse and ELT solutions, tools, and techniques (e.g., Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI with data quality and management, providing opportunities to work on innovative projects and develop new skills in these exciting fields.
- Contact: [email protected] (K.Thipwimon).
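Since the stack above lists DuckDB among the modern engines, here is a small, hedged example of the kind of ad-hoc analysis it enables over Parquet files; the file path and column names are invented for illustration.

```python
# Quick DuckDB analysis sketch over Parquet files (illustrative only; the
# path and column names are hypothetical).
import duckdb

con = duckdb.connect()  # in-memory database
monthly = con.execute(
    """
    SELECT date_trunc('month', txn_date) AS month,
           count(*)                      AS txn_count,
           sum(amount)                   AS total_amount
    FROM read_parquet('transactions/*.parquet')
    GROUP BY 1
    ORDER BY 1
    """
).df()  # returns a pandas DataFrame, handy for Tableau/Grafana extracts
print(monthly.head())
```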
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
Negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouse.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in an RDBMS such as Oracle.
- Develops ETL technical specifications, designs, develops, tests, implements, and supports optimal data solutions.
- Develops and documents ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and write code for all related data extraction, transformation, and loading (ETL) into the databases under their responsibility.
- Creates, executes, and documents unit test plans for ETL and data integration processes and programs.
- Perform problem assessment, resolution and documentation in existing ETL packages, mapping and workflows in production.
- Perform performance tuning of the ETL processes and SQL queries, and recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IaC tools like Terraform or Azure ARM.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
Experience:
3+ years
Skills:
Microsoft Azure, SQL, UNIX, Python, Hadoop
Job type:
Full-time
Salary:
Negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI or Tableau; work across workstreams to support data requirements, including reports and dashboards; and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns following Data Qualit ... (a brief profiling sketch follows this listing).
- 3+ years of experience in big data technology, data engineering, or data analytics application system development.
- Experience with unstructured data for business intelligence or computer science would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop programming.
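A hedged sketch of the data-profiling step mentioned above, using PySpark over lake storage: it computes null counts and cardinality per column. The input path and the choice of metrics are assumptions, not taken from the posting.

```python
# Minimal data-profiling sketch in PySpark (illustrative only; the input
# path is hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profile_customers").getOrCreate()
df = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/customers/")

exprs = [
    F.count(F.when(F.col(c).isNull(), c)).alias(f"{c}_nulls") for c in df.columns
] + [F.countDistinct(c).alias(f"{c}_distinct") for c in df.columns]

profile = df.agg(*exprs)  # one row: null count and cardinality per column
profile.show(truncate=False)
```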
Experience:
No work experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day to day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- Perform first-tier investigation of any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Experience:
5+ years
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
Negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What You'll need to Succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 5+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert level understanding of JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Experience:
3+ years
Skills:
English
Job type:
Full-time
Salary:
Negotiable
- Responsible for planning preventive maintenance schedules for air condition & Fire protection systems.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd-level support to Data Center Operations (FOC), on site to handle incident and problem management.
- 2nd-level support to the engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in the annual report.
- Responsible for implementation of mechanical systems such as comfort air and precision air.
- Responsible for implementation of Fire suppression such as FM200, NOVEC, CO2, Fire Sprinkler, Fire Pump, Fire alarm, and Vesda.
- Working hours are 9:00-18:00 (office time), with the ability to stand by on call or on site on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years of experience in air-conditioning maintenance (comfort air, precision air, air-cooled and water-cooled chillers, pumps, motors): implementing and supporting mechanical air-conditioning systems in buildings or Data Centers.
- At least 1 year of experience in designing air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pumps, motors): implementing and supporting mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as part of a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to reach the site from home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Experience:
5+ years
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
Negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark; a minimal PySpark sketch follows this listing.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
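A minimal PySpark sketch of the ingest-transform-store pattern described in this listing, assuming raw JSON events landed in object storage; the paths, column names, and partitioning choice are hypothetical.

```python
# Minimal PySpark ELT sketch: ingest raw JSON events, apply basic cleaning,
# and land them as partitioned Parquet (illustrative only; paths and column
# names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_elt").getOrCreate()

raw = spark.read.json("s3a://raw-zone/events/2024/")            # ingest
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_type").isNotNull())                 # basic quality gate
)
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://curated-zone/events/"))                   # store for analytics
```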
Experience:
5+ years
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
Negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd-level support to Data Center Operations (FOC), on site to handle incident and problem management.
- 2nd-level support to the engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in the annual report.
- Responsible for implementing electrical systems such as MU, TR, MDB, GEN, UPS, RECT, BATT, ATS.
- Bachelor's degree in Engineering, Electrical Engineering, or a related field.
- More than 5 years of experience in maintaining electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS: implementing and supporting electrical systems in buildings or Data Centers.
- At least 1 year of experience in designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS): implementing and supporting electrical systems in buildings.
- Able to use AutoCAD and Visio.
- Able to work as part of a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to reach the site from home).
- Proficiency in English communication is beneficial for both reading and writing.
- Work Location: TrueIDC - Bangna Site (KM26).
Experience:
5+ years
Skills:
Contracts, Compliance, Project Management, English
Job type:
Full-time
Salary:
Negotiable
- Oversee and coordinate the maintenance and operation of critical infrastructure systems, including but not limited to HV and LV distribution systems, associated plant/equipment, HVAC mechanical cooling/heating systems, fire protection and suppression systems, and electrical and mechanical systems within the portfolio of buildings.
- Manage maintenance contracts and monitor contractor performance to ensure compliance with service level agreements.
- Provide technical support to the maintenance and operations teams, ensuring that all ...
- Monitor and optimize the performance of critical infrastructure systems, ensuring that they operate at peak efficiency and reliability.
- Develop and implement maintenance programs and procedures to ensure that critical infrastructure systems are maintained in accordance with best practice standards.
- Prepare and maintain accurate records of all maintenance and engineering work, including maintenance schedules, work orders, and engineering drawings.
- Develop and maintain relationships with key stakeholders, including Facilities Managers, the Client's staff and representatives, contractors, and suppliers.
- Participate in emergency call-out roster providing cover for weekend and team member absences, as required.
- Volunteer ideas/initiatives that contribute to the service levels and delivery.
- Undertake other tasks, as required by the Client, in accordance with experience and competencies.
- Bachelor's degree in Mechanical/Electrical Engineering or related field.
- Minimum of 5 years of experience in critical environment or data centre operations and maintenance.
- Experience in managing maintenance contracts and monitoring contractor performance.
- Strong technical knowledge of critical infrastructure systems, including but not limited to HV and LV distribution systems, associated plant/equipment, HVAC mechanical cooling/heating systems, fire protection and suppression systems, and electrical and mechanical systems.
- Excellent problem-solving skills, with the ability to identify, diagnose and solve technical issues.
- Excellent English & Thai communication skills, with the ability to communicate technical information to non-technical stakeholders.
- Strong project management skills, with the ability to manage multiple projects simultaneously.
- Knowledge of safety and environmental regulations and standards.
- Ability to work under pressure and in a fast-paced environment.
Experience:
8+ years
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
Negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What You'll need to Succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 8+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert level understanding of JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
Negotiable
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium, complex data sets that meet functional /non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In depth knowledge of data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems - Hadoop ecosystem, cloud technologies (e.g., AWS, Azure, Google), in-memory database systems (e.g., HANA, Hazelcast, etc.), and other database systems - traditional RDBMS (e.g., Teradata, SQL Server, Oracle) and NoSQL databases (e.g., Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools - traditional ETL tools (e.g., Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
Negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Informatica IICS, Databricks/Spark & Python hands-on development skills are a must.
- Clear hands-on experience with database systems - SQL Server, Oracle, Azure Synapse, Snowflake and Cosmos, cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong Knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge & working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
Experience:
2+ years
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
Negotiable
- Develop machine learning models such as credit, income estimation, and fraud models; a minimal modelling sketch follows this listing.
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support collection strategy.
- Bachelor's Degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience in programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience in building & implementing AI/ML solutions for a senior role.
- Experience with Python libraries such as NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, and Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, and VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
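A minimal, hedged sketch of the credit-modelling work this listing describes, using a scikit-learn logistic regression as a stand-in scorecard; the dataset file, feature names, and target column are invented for illustration.

```python
# Toy credit-scoring model sketch (illustrative only; the CSV file, feature
# names, and target column are hypothetical).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

data = pd.read_csv("loan_applications.csv")
features = ["income", "utilization", "tenure_months"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["default_flag"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out AUC: {auc:.3f}")  # a typical scorecard discrimination metric
```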
Skills:
ISO 27001, English
Job type:
Full-time
Salary:
Negotiable
- Responsible for monitoring, controlling, and managing facility infrastructure systems (electrical, air conditioning, and network) to support operations.
- Respond to customer requirements and coordinate vendor installations and system troubleshooting so that work is carried out correctly and completely according to standard practice.
- Supervise and coordinate preventive maintenance and repairs of infrastructure systems, such as generators, UPS, electrical switchboards, air-conditioning systems, and the installation of network equipment.
- Act as 1st-level support and troubleshooting for facility systems in the Data Center, such as network, electrical, and air-conditioning systems.
- Prepare operating procedures and work manuals for maintaining the infrastructure systems, based on ISO or other relevant operational standards (e.g., ISO 20000 for service, ISO 27001 for security, ISO 50001 for energy management, and others such as ISO 22301, PCI DSS, TCOS), including record forms and reports.
- Summarize and report any critical issues to the team lead, and prepare statistical and analytical reports on a daily, monthly, and quarterly basis.
- Bachelor's degree in electrical power, mechanics, or related fields.
- Thai nationality, Male, Age 20 - 25 years old.
- Have basic technical knowledge in Data Center facilities (Electrical/Mechanical).
- Able to work under pressure.
- Able to work with a team.
- Fair communication in English.
Experience:
3+ years
Skills:
Kubernetes, Automation, Redis
Job type:
Full-time
Salary:
Negotiable
- Platform Operations: Manage and operate our Kubernetes platform, ensuring high availability, performance, and security.
- Automation & Tooling: Design, develop, and implement automation solutions for operational tasks, infrastructure provisioning, and application deployment; a small automation sketch follows this listing.
- Observability: Build and maintain a comprehensive observability stack (monitoring, logging, tracing) to proactively identify and resolve issues.
- Platform Stability & Performance: Implement and maintain proactive measures to ensure platform stability, performance optimization, and capacity planning.
- Middleware Expertise: Provide support and expertise for critical middleware tools such as RabbitMQ, Redis, and Kafka, ensuring their optimal performance and reliability.
- Incident Response: Participate in our on-call rotation, troubleshoot and resolve production incidents efficiently, and implement preventative measures.
- Collaboration: Collaborate effectively with development and other engineering teams.
- Positive attitude and empathy for others.
- Passion for developing and maintaining reliable, scalable infrastructure.
- A minimum of 3 years working experience in relevant areas.
- Experience in managing and operating Kubernetes in a production environment.
- Experienced with cloud platforms like AWS or GCP.
- Experienced with high availability, high-scale, and performance systems.
- Understanding of cloud-native architectures.
- Experienced with DevSecOps practices.
- Strong scripting and automation skills using languages like Python, Bash, or Go.
- Proven experience in building and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Deep understanding of monitoring, logging, and tracing tools and techniques.
- Experience with infrastructure-as-code tools (e.g., Terraform, Ansible).
- Strong understanding of Linux systems administration and networking concepts.
- Experience working with middleware technologies like RabbitMQ, Redis, and Kafka.
- Excellent problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Strong interest and ability to learn any new technical topic.
- Experience with container security best practices.
- Experience with chaos engineering principles and practices.
- Experience in the Financial Services industry.
- Opportunity to tackle challenging projects in a dynamic environment.
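A small, hedged example of the kind of platform automation referenced above: a script that lists pods not in a healthy phase so on-call engineers (or a bot) can follow up. It assumes the official `kubernetes` Python client and an available kubeconfig; the filtering logic is illustrative only.

```python
# Minimal platform-automation sketch: report pods that are not Running or
# Succeeded (illustrative only; assumes a reachable cluster and kubeconfig).
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```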
Experience:
5+ years
Skills:
Research, Statistics, Finance, English
Job type:
Full-time
Salary:
Negotiable
- Develop, maintain, and calibrate existing quantitative risk models, including provisioning models and credit scoring tailored to various portfolio types and financial institutions.
- Perform both conceptual and quantitative reviews of models, including validation, using programming scripts or automated tools.
- Provide business insights on post-model adjustments, such as management overlays.
- Research risk management topics and stay updated on recent industry developments.
- Prepare comprehensive model documentation, reports, or presentations to communicate methodologies and results to clients.
- Effectively convey observations, results, thoughts, and initiatives to client stakeholders in both Thai and English through proficient presentation during virtual and in-person meetings as needed.
- Propose innovative ideas to enhance team efficiency and effectiveness.
- Collaborate with colleagues and clients across multiple countries, primarily within Southeast Asia.
- Support partners and directors in preparing client proposals under tight deadlines.
- Mentor and onboard junior staff, ensuring the delivery of high-quality work.
- You will be expected to communicate closely with senior management and client personnel; assist in proposal development; mentor and develop junior team members; and maintain up-to-date knowledge of financial risk management methodologies, current corporate governance and regulatory developments/requirements, both locally and internationally.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Senior Associates / Senior Consultants / Assistant Managers across our Firm are expected to: Actively seek out developmental opportunities for growth, act as strong brand ambassadors for the firm as well as share their knowledge and experience with others.
- Respect the needs of their colleagues and build up cooperative relationships.
- Understand the goals of our internal and external stakeholder to set personal priorities as well as align their teams work to achieve the objectives.
- Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
- Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
- Offer insights based on a solid understanding of what makes Deloitte successful.
- Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
- Understand disruptive trends and promote potential opportunities for improvement.
- You are someone with: A degree, preferably in technical engineering, statistics, economics, mathematics, finance, accountancy, or a related field.
- Possess a minimum of 5 years of relevant work experience. A background in banking or financial institutions is preferred, but this can be supplemented with significant knowledge of the financial markets and banking industry.
- Strong knowledge of risk management, with a focus on one of the risk domains namely credit risk, market risk, operational risk and climate risk preferred.
- Ability to work independently and collaboratively with a diverse range of staff on qualitative and quantitative risk management in multitasking and cross-country settings.
- Proficient in data analytics or statistical analysis tools (i.e., Python and SAS), with advanced Excel skills.
- Experience in mentoring and coaching at least 2-3 junior team members.
- Proficient in business-level English, with the ability to communicate ideas and prepare professional client presentations.
- Due to volume of applications, we regret only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website. Requisition ID: 105622. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
Experience:
4+ years
Skills:
Business Development, Statistics, Finance, English
Job type:
Full-time
Salary:
Negotiable
- Be part of an engagement advisory team to develop, validate, and enhance credit risk models (e.g., IFRS 9 ECL models, credit scoring / scorecards, and credit rating) based on industry best practices (a toy ECL calculation follows this listing). You will also be able to learn and work in other quantitative and analytical financial risk areas such as model risk management, business intelligence, machine learning, and artificial intelligence.
- Assist in managing / driving the project, team, and client servicing.
- Involve in business development initiatives in the aforementioned areas.
- You will be expected to communicate closely with senior management and client personnel; assist in proposal development; mentor and develop junior team members; and maintain up-to-date knowledge of financial risk management methodologies, current corporate governance and regulatory developments/requirements, both locally and internationally.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Senior Associates / Senior Consultants / Assistant Managers across our Firm are expected to: Actively seek out developmental opportunities for growth, act as strong brand ambassadors for the firm as well as share their knowledge and experience with others.
- Respect the needs of their colleagues and build up cooperative relationships.
- Understand the goals of our internal and external stakeholder to set personal priorities as well as align their teams work to achieve the objectives.
- Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
- Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
- Offer insights based on a solid understanding of what makes Deloitte successful.
- Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
- Understand disruptive trends and promote potential opportunities for improvement.
- You are someone with: 4-8 years of relevant experience spent within a credit risk model development or model validation team at major banks / financial institutions or consulting firms.
- Solid academic background with a Degree in Statistics, Data Science / AI, Financial Engineering, Quantitative Finance, or other relevant post graduate degree.
- Solid knowledge of common practices in credit risk models, including IFRS 9 expected credit losses (PD, LGD, EAD) and credit scoring / scorecard (Application, Behavioural, Credit Rating) methodologies.
- Solid knowledge of supervisory/regulatory requirements as it pertains to credit risk models, including IFRS 9 ECL and Basel.
- Foundation knowledge in statistics and machine learning (e.g. Classification, Regression, Clustering, Hypothesis testing).
- Hands-on data processing, reporting/visualization and modelling skill in pertinent languages such as Python, R, SAS, and Excel(VBA).
- Strong critical thinking and analytical problem-solving abilities.
- Ability to communicate complex quantitative analysis in a clear, precise manner.
- Proficiency in English & Thai.
- For male candidates, a Certificate of Military Exemption is a must.
- We offer the successful candidate an attractive remuneration package and the opportunity to work in a dynamic and exciting environment. Successful candidates will have the opportunity to develop their technical knowledge of financial risk management as well as to work with a number of high-profile local and international financial institutions.
- Due to volume of applications, we regret only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website. Requisition ID: 101624. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
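As a hedged illustration of the IFRS 9 expected-credit-loss mechanics mentioned in the requirements, the toy calculation below discounts marginal PD * LGD * EAD per period; the figures are invented for illustration and do not represent any methodology used by the firm.

```python
# Toy lifetime ECL: sum over t of PD_t * LGD * EAD_t / (1 + r)^t
# (illustrative figures only; real IFRS 9 models are far more granular).
marginal_pd = [0.020, 0.015, 0.010]    # probability of default in years 1-3
ead = [1_000_000, 800_000, 600_000]    # exposure at default per year
lgd = 0.45                             # loss given default
discount_rate = 0.05                   # effective interest rate

ecl = sum(
    pd_t * lgd * ead_t / (1 + discount_rate) ** (t + 1)
    for t, (pd_t, ead_t) in enumerate(zip(marginal_pd, ead))
)
print(f"lifetime ECL: {ecl:,.0f}")     # ~15,802 for these inputs
```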