Skills:
SQL, Oracle, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- At least 7 years of experience as a Data Engineer or in a related role.
- Hands-on experience with SQL, database management (e.g., Oracle, SQL Server, PostgreSQL), and data warehousing concepts.
- Experience with ETL/ELT tools such as Talend, Apache NiFi, or similar.
- Proficiency in programming languages like Python, Java, or Scala for data manipulation and automation.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Strong understanding of data governance, security, and privacy frameworks in a financial services context.
- Excellent problem-solving skills and attention to detail.
- Experience working with Data Visualization or BI tools like Power BI, Tableau.
- Familiarity with machine learning concepts, model deployment, and AI applications.
- Banking or financial services industry experience, especially in retail or wholesale banking data solutions.
- Certification in cloud platforms (e.g., AWS Certified Data Engineer, Microsoft Azure Data Engineer, Google Professional Data Engineer).
- You can read and review the privacy policy of Krungthai Bank PCL at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Therefore, please do not upload any documents, including copies of your national ID card, or enter sensitive personal data or any other information that is not relevant or necessary for the job application on the website. In addition, please ensure that any sensitive personal data (if any) has been removed from your resume and other documents before uploading them. The Bank does need to collect personal data about your criminal record in order to consider your application, or to verify qualifications, disqualifications, or suitability for the position; consent to the collection, use, or disclosure of such data is necessary for entering into a contract and for consideration for the above purposes. If you do not give consent, or later withdraw it, the Bank may be unable to proceed for those purposes, and you may lose the opportunity to be considered for employment with the Bank.
Experience:
3 years required
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience with data modelling techniques and integration patterns.
- Write data transformation jobs through code.
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge of the data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solutions.
- Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience working with data warehouse and ELT solutions, tools, and techniques (e.g., Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI alongside data quality and management, providing opportunities to work on innovative projects and develop new skills in these fields.
- Contact: [email protected] (K.Thipwimon).
Experience:
3 years required
Skills:
Microsoft Azure, SQL, UNIX, Python, Hadoop
Job type:
Full-time
Salary:
negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI or Tableau; work across workstreams to support data requirements (including reports and dashboards); and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns following Data Quality ...
- 3+ years of experience in big data technology, data engineering, or data analytics application system development.
- Experience with unstructured data for business intelligence would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop programming.
Experience:
No experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day-to-day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- Provide first-tier investigation of any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Experience:
3 years required
Skills:
English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for air-conditioning & fire protection systems.
- Responsible for coordinating and managing vendors and suppliers for preventive maintenance and payment plans.
- Provide 2nd-level support to Data Center Operations (FOC), on site, for incident and problem management.
- Provide 2nd-level support to engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE, cost saving energy and report.
- Measured Efficiency air system and record annual report.
- Responsible for implementation of Mechanical such as comfort air, precision air.
- Responsible for implementation of Fire suppression such as FM200, NOVEC, CO2, Fire Sprinkler, Fire Pump, Fire alarm, and Vesda.
- Working hours: office time 9:00 - 18:00, with the ability to stand by on call or work on site on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years of experience in air-conditioning maintenance (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementation and support of mechanical air-conditioning systems in buildings or Data Centers.
- At least 1 year of experience in designing air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementation and support of mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to arrive on site from home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or related.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, or near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good in communication and writing in English.
- Good interpersonal and communication skills.
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit model, income estimation model and fraud model.
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support collection strategy.
- Bachelor's Degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience in programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience in building & implementing AI/ML solutions for a senior role.
- Experience with Python libraries: NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, and VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouse.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in one of these RDBMSs such as Oracle.
- Develop ETL technical specifications; design, develop, test, implement, and support optimal data solutions.
- Document ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and create codes for all related data extraction, transformation and loading (ETL) into database under responsibilities.
- Creates, executes, and documents unit test plans for ETL and data integration processes and programs.
- Perform problem assessment, resolution and documentation in existing ETL packages, mapping and workflows in production.
- Performance tuning of the ETL process and SQL queries and recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IaC tools like Terraform or Azure ARM.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
Experience:
5 years required
Skills:
Contracts, Compliance, Project Management, English
Job type:
Full-time
Salary:
negotiable
- Oversee and coordinate the maintenance and operation of critical infrastructure systems, including but not limited to HV and LV distribution systems, associated plant/equipment, HVAC mechanical cooling/heating systems, fire protection and suppression systems, and electrical and mechanical systems within the portfolio of buildings.
- Manage maintenance contracts and monitor contractor performance to ensure compliance with service level agreements.
- Provide technical support to the maintenance and operations teams, ensuring that all ...
- Monitor and optimize the performance of critical infrastructure systems, ensuring that they operate at peak efficiency and reliability.
- Develop and implement maintenance programs and procedures to ensure that critical infrastructure systems are maintained in accordance with best practice standards.
- Prepare and maintain accurate records of all maintenance and engineering work, including maintenance schedules, work orders, and engineering drawings.
- Develop and maintain relationships with key stakeholders, including Facilities Managers, the Client's staff and representatives, contractors, and suppliers.
- Participate in emergency call-out roster providing cover for weekend and team member absences, as required.
- Volunteer ideas/initiatives that contribute to the service levels and delivery.
- Undertake other tasks, as required by the Client, in accordance with experience and competencies.
- Bachelor's degree in Mechanical/Electrical Engineering or related field.
- Minimum of 5 years of experience in critical environment or data centre operations and maintenance.
- Experience in managing maintenance contracts and monitoring contractor performance.
- Strong technical knowledge of critical infrastructure systems, including but not limited to HV and LV distribution systems, associated plant/equipment, HVAC mechanical cooling/heating systems, fire protection and suppression systems, and electrical and mechanical systems.
- Excellent problem-solving skills, with the ability to identify, diagnose and solve technical issues.
- Excellent English & Thai communication skills, with the ability to communicate technical information to non-technical stakeholders.
- Strong project management skills, with the ability to manage multiple projects simultaneously.
- Knowledge of safety and environmental regulations and standards.
- Ability to work under pressure and in a fast-paced environment.
Experience:
5 years required
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers for preventive maintenance and payment plans.
- Provide 2nd-level support to Data Center Operations (FOC), on site, for incident and problem management.
- Provide 2nd-level support to engineering teams at all Data Center sites (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE, cost saving energy and report.
- Measured Efficiency air system and record annual report.
- Responsible for implementing electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS.
- Bachelor's degree in Engineering, Electrical Engineering, or a related field.
- More than 5 years of experience in maintenance of electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS: implementation and support of electrical systems in buildings or Data Centers.
- At least 1 year of experience in designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS): implementation and support of electrical systems in buildings.
- Able to use AutoCAD and Visio.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to arrive on site from home).
- Proficiency in English communication is beneficial for both reading and writing.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
ISO 27001, English
Job type:
Full-time
Salary:
negotiable
- Responsible for monitoring, controlling, and managing basic infrastructure systems (electrical, air conditioning, and network) to support operations.
- Respond to customer requirements and coordinate vendors' installation and troubleshooting work to ensure correctness and completeness according to established practices.
- Control and coordinate preventive maintenance and repair of basic systems, such as generators, UPS units, electrical cabinets, air-conditioning systems, and network equipment installation.
- Act as 1st-level support & troubleshooting for Data Center facility systems, e.g., network, electrical, and air-conditioning systems.
- Prepare operating procedures and work manuals for basic system operations based on ISO or other relevant operational standards (e.g., ISO 20000 for service, ISO 27001 for security, ISO 50001 for energy management, and others such as ISO 22301, PCI DSS, TCOS), including record forms and reports.
- Summarize and report any critical issues to the team lead, and prepare statistical and analytical reports on a daily, monthly, and quarterly basis.
- Bachelor's degree in electrical power, mechanics, or related fields.
- Thai nationality, Male, Age 20 - 25 years old.
- Have basic technical knowledge in Data Center facilities (Electrical/Mechanical).
- Able to work under pressure.
- Able to work with a team.
- Fair communication in English.
Experience:
5 years required
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What You'll need to Succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 5+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert level understanding of JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Background in programming, databases and/or big data technologies, OR
- BS/MS in software engineering, computer science, economics, or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium, complex data sets that meet functional /non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure the technology footprint adheres to data security policies and procedures related to encryption, obfuscation, and role-based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In depth knowledge of data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: the Hadoop ecosystem, cloud technologies (e.g. AWS, Azure, Google), in-memory database systems (e.g. HANA, Hazelcast), traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools: traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Informatica IICS, Databricks/Spark & Python hands-on development skills are a must.
- Clear hands-on experience with database systems - SQL server, Oracle, Azure Synapse, Snowflake and Cosmos, Cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong Knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge & working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos DB, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Define data retention policies, monitor performance, and advise on any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
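The pipeline responsibilities above can be sketched as a minimal extract-transform-load step. This uses only the standard library (csv, sqlite3) so it is self-contained; the payments table, columns, and sample data are invented for illustration, and the real role would implement this with PySpark against cloud storage as described.

```python
import csv
import io
import sqlite3

# Illustrative ETL step: read raw CSV records, drop malformed rows,
# and load the result into a SQL store. sqlite3 and an in-memory
# string stand in for the cloud data lake and warehouse.
RAW = """id,amount,currency
1,100.50,THB
2,,THB
3,75.00,USD
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Drop rows with a missing amount and normalize the types.
    return [
        {"id": int(r["id"]), "amount": float(r["amount"]), "currency": r["currency"]}
        for r in rows
        if r["amount"]
    ]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (:id, :amount, :currency)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # row 2 has no amount, so 2 rows are loaded
```

The same extract/transform/load split carries over directly to a PySpark job, with the transform step expressed as DataFrame operations instead of a list comprehension.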
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
Experience:
8 years required
Skills:
Scala, Java, Golang
Job type:
Full-time
Salary:
negotiable
- Lead the team technically in improving scalability, stability, accuracy, speed and efficiency of our existing Data systems.
- Build, administer and scale data processing pipelines.
- Be comfortable navigating the following technology stack: Scala, Spark, Java, Golang, Python 3, scripting (Bash/Python), Hadoop, SQL, S3, etc.
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems.
- Design, build, test and deploy new libraries, frameworks or full systems for our core systems while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools to automate many large-scale data management / analysis tasks.
- What you'll need to succeed.
- Bachelor's degree in Computer Science /Information Systems/Engineering/related field.
- 8+ years of experience in software and data engineering.
- Good experience in Apache Spark.
- Expert level understanding of JVM and either Java or Scala.
- Experience debugging and reasoning about production issues is desirable.
- A good understanding of data architecture principles preferred.
- Any other experience with Big Data technologies / tools.
- SQL experience.
- Analytical problem-solving capabilities & experience.
- Systems administration skills in Linux.
- It's great if you have.
- Good understanding of Hadoop ecosystems.
- Experience working with Open-source products.
- Python/Shell scripting skills.
- Working in an agile environment using test driven methodologies.
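The Spark data-processing work described above is, at its core, keyed aggregation over large datasets. As a self-contained stand-in (no Spark cluster assumed), here is a `reduceByKey`-style grouping in plain Python; the booking records are invented for illustration.

```python
from collections import defaultdict
from functools import reduce

# Plain-Python sketch of Spark's reduceByKey: group (key, value)
# pairs and fold each group with a combining function. A real
# pipeline would run this distributed over partitioned data.
bookings = [
    ("BKK", 2), ("SIN", 1), ("BKK", 3), ("HKG", 5), ("SIN", 4),
]

def reduce_by_key(pairs, fn):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: reduce(fn, values) for key, values in grouped.items()}

totals = reduce_by_key(bookings, lambda a, b: a + b)
print(totals)  # {'BKK': 5, 'SIN': 5, 'HKG': 5}
```

In Spark the same operation is a one-liner (`rdd.reduceByKey(_ + _)` in Scala), but the shuffle-then-combine structure is identical.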
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Experience:
3 years required
Job type:
Full-time
Salary:
negotiable
- We are seeking a dedicated Fixed Broadband Technical Support Engineer to join our team. This role is crucial in providing exceptional technical support and troubleshooting for our fixed broadband services. The ideal candidate will possess a strong understanding of network technologies, excellent customer service skills, and the ability to resolve technical issues efficiently.
- Provide 2nd-tier technical support to customers experiencing issues with fixed broadband networks & services.
- Diagnose and troubleshoot connectivity issues, including hardware, software, and network configuration problems.
- Monitor, analyze, and report network/service performance metrics, proactively identifying potential issues.
- Collaborate with engineering and product teams to resolve complex technical problems and provide feedback for service improvements.
- Maintain detailed documentation of customer interactions, technical issues, and resolutions in the ticketing system.
- Stay updated on new technologies, products, and industry trends to enhance service quality and effectiveness.
- 24x7 standby shifts to support first-line staff.
- 0-3 years of experience in Telecommunications or IT Operations.
- Bachelor's degree in Computer Science, Information Technology, Telecommunications, or a related field (or equivalent experience).
- Excellent communication skills, both verbal and written, with an ability to explain technical concepts to non-technical users.
- Customer-oriented mindset with strong problem-solving skills.
- Ability to work independently and as part of a team in a fast-paced environment.
- Basic computer skills (Excel, Word, PowerPoint, etc.) are essential.
- Experience with various broadband technologies (FTTx, MPLS, VLL, VPN, routing protocols) and associated equipment.
- Relevant certifications (e.g., CompTIA Network+, Cisco Certified Network Associate) are advantageous.
- Programming skills in Python, shell script, or PHP are advantageous.
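The proactive metric monitoring mentioned above can be sketched as a small script that parses performance samples and flags links breaching a threshold. The log format and the 50 ms latency threshold are illustrative assumptions, not from the posting.

```python
# Minimal sketch of proactive network-metric monitoring: parse
# key=value latency samples and report links over a threshold.
LOG = """\
link=FTTX-01 latency_ms=12
link=MPLS-07 latency_ms=86
link=VLL-03 latency_ms=49
"""

THRESHOLD_MS = 50.0

def parse(line):
    fields = dict(part.split("=", 1) for part in line.split())
    return fields["link"], float(fields["latency_ms"])

def breaches(log_text, threshold=THRESHOLD_MS):
    return [link for link, ms in map(parse, log_text.splitlines()) if ms > threshold]

print(breaches(LOG))  # only MPLS-07 exceeds 50 ms
```

A production version would pull metrics from the monitoring system rather than a text log, but the parse-then-threshold shape is the same.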
- What happens once you apply?
- Click Here to find all you need to know about what our typical hiring process looks like.
- We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferrable skills, and we can support you with the skills that you need to develop.
- Encouraging a diverse and inclusive organization is core to our values at Ericsson, that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity and Affirmative Action employer, learn more.
- We are committed to providing reasonable accommodations to all individuals participating in the application and interview process. If you need assistance or to request an accommodation due to a disability, please reach out to Contact Us.
- We are proud to announce that Ericsson Thailand has again been officially Great Place to Work Certified in 2023. Every year, more than 10,000 organizations from over 60 countries partner with the Great Place to Work Institute for assessment, benchmarking, and planning actions to strengthen their workplace culture, and this certification acknowledges that our employees value their employee experience and our workplace culture.
- Primary country and city: Thailand (TH) || Bangkok.
- Job details: Automated Operations Engineer.
- Primary Recruiter: Sitthinon Charoenkitwayo.
Experience:
3 years required
Skills:
Kubernetes, Automation, Redis
Job type:
Full-time
Salary:
negotiable
- Platform Operations: Manage and operate our Kubernetes platform, ensuring high availability, performance, and security.
- Automation & Tooling: Design, develop, and implement automation solutions for operational tasks, infrastructure provisioning, and application deployment.
- Observability: Build and maintain a comprehensive observability stack (monitoring, logging, tracing) to proactively identify and resolve issues.
- Platform Stability & Performance: Implement and maintain proactive measures to ensure platform stability, performance optimization, and capacity planning.
- Middleware Expertise: Provide support and expertise for critical middleware tools such as RabbitMQ, Redis, and Kafka, ensuring their optimal performance and reliability.
- Incident Response: Participate in our on-call rotation, troubleshoot and resolve production incidents efficiently, and implement preventative measures.
- Collaboration: Collaborate effectively with development and other engineering teams.
- Positive attitude and empathy for others.
- Passion for developing and maintaining reliable, scalable infrastructure.
- A minimum of 3 years working experience in relevant areas.
- Experience in managing and operating Kubernetes in a production environment.
- Experienced with cloud platforms like AWS or GCP.
- Experienced with high availability, high-scale, and performance systems.
- Understanding of cloud-native architectures.
- Experienced with DevSecOps practices.
- Strong scripting and automation skills using languages like Python, Bash, or Go.
- Proven experience in building and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Deep understanding of monitoring, logging, and tracing tools and techniques.
- Experience with infrastructure-as-code tools (e.g., Terraform, Ansible).
- Strong understanding of Linux systems administration and networking concepts.
- Experience working with middleware technologies like RabbitMQ, Redis, and Kafka.
- Excellent problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Strong interest and ability to learn any new technical topic.
- Experience with container security best practices.
- Experience with chaos engineering principles and practices.
- Experience in the Financial Services industry.
- Opportunity to tackle challenging projects in a dynamic environment.
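The automation and incident-response work described above leans on a few recurring patterns; one of the most common is retrying a flaky operation with exponential backoff. The fake `flaky_provision` call below is invented for illustration; real tooling would wrap a Kubernetes or cloud API request.

```python
import time

# Sketch of a retry-with-exponential-backoff helper, a staple of
# operational automation: each failed attempt doubles the wait
# before trying again, up to a fixed number of attempts.
def retry(op, attempts=4, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return op()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...

calls = {"n": 0}

def flaky_provision():
    # Hypothetical operation that fails transiently twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient API error")
    return "provisioned"

result = retry(flaky_provision)
print(result, calls["n"])  # succeeds on the third call
```

Production variants usually add jitter to the delay and retry only on error classes known to be transient.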
Skills:
Quality Assurance, Assurance, Data Analysis, English
Job type:
Full-time
Salary:
negotiable
- Review test cases, test instructions/procedures, and all related documents, including quality assurance and UAT documentation. (Manual testing 90%.)
- Coordinate with users, stakeholders, and related teams to resolve testing questions and/or provide advice and direction for all testing activities on a project.
- Perform moderately complex to complex test data conditioning, functional testing, regression testing, and test validation.
- Provide specific guidance on defects to developers.
- Log, track, and verify resolution of software and specification defects.
- Execute data analysis of reference data to ensure completeness of fact data used for reporting.
- Mentor and guide junior team members.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, or related fields.
- 1-3 years' experience in IT (Quality Assurance) and software testing or related fields.
- Solid organizational skills including attention to detail and multitasking skills.
- Experience of working within both Agile and Waterfall frameworks.
- Solid knowledge and practical experience working within the SDLC (Software Development Life Cycle) and STLC (Software Test Life Cycle), as well as standard processes and industry best practice.
- Experience in creating, executing and maintaining test cases including test plans/strategies.
- Good command of English.
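The test-case creation and regression work described above can be illustrated with a minimal automated check. The `discounted_price` function and its discount rules are invented purely as a system under test; the point is the shape of a spec-driven regression test.

```python
import unittest

# Hypothetical system under test: a tiered-discount calculator.
def discounted_price(price, customer_tier):
    rates = {"gold": 0.10, "silver": 0.05}
    return round(price * (1 - rates.get(customer_tier, 0.0)), 2)

# Regression tests pinning the specified behavior: each test maps
# to one rule in the (invented) spec, so a future change that
# breaks a rule fails loudly.
class DiscountRegressionTest(unittest.TestCase):
    def test_gold_tier(self):
        self.assertEqual(discounted_price(100.0, "gold"), 90.0)

    def test_unknown_tier_gets_no_discount(self):
        self.assertEqual(discounted_price(100.0, "bronze"), 100.0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountRegressionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

The same one-rule-per-test discipline applies to the manual test cases the role focuses on: each case should trace back to a single requirement.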
Experience:
4 years required
Skills:
Business Development, Statistics, Finance, English
Job type:
Full-time
Salary:
negotiable
- Be part of an engagement advisory team to Develop/Validate/Enhance Credit Risk models (e.g. IFRS 9 ECL Model, Credit Scoring / Scorecard, and Credit Rating) based on industry best practices. You will also be able to learn and work in other quantitative and analytical financial risk areas such as model risk management, business intelligence, machine learning and artificial intelligence.
- Assist in managing / driving the project, team, and client servicing.
- Involve in business development initiatives in the aforementioned areas.
- You will be expected to communicate closely with senior management and client personnel; assist in proposal development; mentor and develop junior team members; and maintain up-to-date knowledge of financial risk management methodologies as well as current corporate governance and regulatory developments/requirements, both locally and internationally.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Senior Associates / Senior Consultants / Assistant Managers across our Firm are expected to: Actively seek out developmental opportunities for growth, act as strong brand ambassadors for the firm as well as share their knowledge and experience with others.
- Respect the needs of their colleagues and build up cooperative relationships.
- Understand the goals of our internal and external stakeholders to set personal priorities as well as align their teams' work to achieve the objectives.
- Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
- Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
- Offer insights based on a solid understanding of what makes Deloitte successful.
- Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
- Understand disruptive trends and promote potential opportunities for improvement.
- You are someone with: 4-8 years of relevant experience spent within a credit risk model development or model validation team at major banks / financial institutions or consulting firms.
- Solid academic background with a Degree in Statistics, Data Science / AI, Financial Engineering, Quantitative Finance, or other relevant post graduate degree.
- Solid knowledge of common practices in credit risk models, including IFRS 9 expected credit losses (PD, LGD, EAD) and credit scoring / scorecard (Application, Behavioural, Credit Rating) methodologies.
- Solid knowledge of supervisory/regulatory requirements as it pertains to credit risk models, including IFRS 9 ECL and Basel.
- Foundation knowledge in statistics and machine learning (e.g. Classification, Regression, Clustering, Hypothesis testing).
- Hands-on data processing, reporting/visualization, and modelling skills in pertinent languages such as Python, R, SAS, and Excel (VBA).
- Strong critical thinking and analytical problem-solving abilities.
- Ability to communicate complex quantitative analysis in a clear, precise manner.
- Proficiency in English & Thai.
- For male candidates, a Certificate of Military Exemption is a must.
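The IFRS 9 expected credit loss models referenced above combine the three components the posting names: ECL is the sum over periods of marginal PD x LGD x EAD, discounted at the effective interest rate. The sketch below is a deliberately simplified, single-scenario illustration with invented figures; real models estimate PD/LGD/EAD term structures from data and weight multiple macroeconomic scenarios.

```python
# Simplified lifetime ECL: sum of discounted marginal losses,
# ECL = sum_t PD_t * LGD * EAD_t / (1 + EIR)^t.
# All inputs here are invented for illustration.
def expected_credit_loss(pd_by_year, lgd, ead_by_year, eir):
    ecl = 0.0
    for t, (pd_t, ead_t) in enumerate(zip(pd_by_year, ead_by_year), start=1):
        ecl += pd_t * lgd * ead_t / (1 + eir) ** t
    return ecl

# A 3-year exposure with rising marginal default probabilities
# and an amortizing exposure at default.
ecl = expected_credit_loss(
    pd_by_year=[0.02, 0.03, 0.04],
    lgd=0.45,
    ead_by_year=[1_000_000, 900_000, 800_000],
    eir=0.05,
)
print(round(ecl, 2))
```

Stage 1 assets under IFRS 9 would use only the 12-month slice of this sum; stage 2 and 3 use the full lifetime horizon.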
- We offer the successful candidate an attractive remuneration package and the opportunity to work in a dynamic and exciting environment. Successful candidates will have the opportunity to develop their technical knowledge of financial risk management as well as to work with a number of high-profile local and international financial institutions.
- Due to volume of applications, we regret only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website. Requisition ID: 101624. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.