Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark (a brief sketch follows this list).
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
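A minimal sketch of the kind of PySpark batch pipeline these duties describe, assuming a working Spark environment; the bucket paths and the event_id / event_ts column names are illustrative assumptions, not details from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-events-etl").getOrCreate()

# Ingest: read raw JSON landed in a cloud bucket (path is hypothetical).
raw = spark.read.json("s3a://example-landing-zone/raw_events/")

# Process: de-duplicate and conform types before the curated layer.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Store: write partitioned Parquet into the data lake (path is hypothetical).
clean.write.mode("append").partitionBy("event_date").parquet(
    "s3a://example-lake/curated/events/"
)
```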
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
Skills:
Power BI, SQL, Python, English
Job type:
Full-time
Salary:
negotiable
- Understand and document business requirements for developing data solutions.
- Develop dashboards, reports, and analyses that provide actionable insights.
- Balance retail/wholesale business acumen with data management expertise and technical proficiency.
- Collaborate closely with data engineers, data scientists, and business partners.
- Contribute to the delivery of robust, scalable data solutions with a focus on performance, security, and governance.
- Requirements: Bachelor's degree in a STEM or related field.
- 1-3 years of experience as a data analyst, preferably in the retail or wholesale industries.
- Proficiency in Power BI, SQL, and Python.
- Basic understanding of cloud data platforms.
- Ability to translate business requirements into effective data solutions.
- Strong communication skills, including: Fluency in verbal and written English.
- Ability to clearly articulate technical concepts to stakeholders with varying technical backgrounds.
Skills:
ETL, Python, Java
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems (a short example follows this list).
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
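A short sketch of the kind of automated data-quality check the duties above call for, using pandas; the orders table, its columns, and the 1% null threshold are assumptions made for illustration:

```python
import pandas as pd

def check_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality violations found in the frame."""
    problems = []
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        problems.append("negative order amounts")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing customer IDs
        problems.append(f"customer_id null rate {null_rate:.1%} exceeds 1%")
    return problems

orders = pd.read_parquet("orders.parquet")  # hypothetical extract
for problem in check_orders(orders):
    print("DQ violation:", problem)
```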
- Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Skills:
ETL, Java, Python
Job type:
Full-time
Salary:
negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Requirements: Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Strong numerical and analytical skills.
- Experience with BI tools such as Tableau and Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit, income estimation, and fraud models (a brief sketch follows this list).
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support collection strategy.
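A hedged sketch of a credit-default style model like those listed above, built with scikit-learn on synthetic data; both features and the toy label are invented purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000
X = np.column_stack([
    rng.normal(40_000, 15_000, n),  # stand-in for monthly income
    rng.integers(0, 10, n),         # stand-in for count of late payments
])
y = (X[:, 1] > 5).astype(int)       # toy default label, illustration only

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```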
- Bachelor's degree in Computer Science, Operations Research, Engineering, or a related quantitative discipline.
- 2-5 years of experience with programming languages such as Python, SQL, or Scala.
- 5+ years of hands-on experience in building & implementing AI/ML solutions for a senior role.
- Experience with Python libraries: NumPy, scikit-learn, OpenCV, TensorFlow, PyTorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, and VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
Experience:
3 years required
Skills:
Big Data, Hive, SAS
Job type:
Full-time
Salary:
negotiable
- Design, implement, and maintain data analytics pipelines and processing systems.
- Experience of data modelling techniques and integration patterns.
- Write data transformation jobs through code (see the sketch after this list).
- Analyze large datasets to extract insights and identify trends.
- Perform data management through data quality tests, monitoring, cataloging, and governance.
- Knowledge in data infrastructure ecosystem.
- Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Build data visualizations to communicate findings to stakeholders.
- A willingness to learn and find solutions to complex problems.
- Stay up-to-date with the latest developments in data analytics and science.
- Experience migrating from on-premise data stores to cloud solutions.
- Knowledge of system design and platform thinking to build sustainable solutions.
- Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
- Experience working with data warehouse and ELT solutions, tools, and techniques (e.g., Airflow, dbt, SAS, Matillion, NiFi).
- Experience with agile software delivery and CI/CD processes.
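A small sketch of a transformation job with an accompanying data-quality test, using DuckDB (one of the stacks named above); the sales table and its columns are hypothetical:

```python
import duckdb

con = duckdb.connect()  # in-memory database for the example

con.sql("""
    CREATE TABLE sales AS
    SELECT * FROM (VALUES
        ('2024-01-01', 'A', 100.0),
        ('2024-01-01', 'B', NULL),
        ('2024-01-02', 'A', 250.0)
    ) AS t(sale_date, store, amount)
""")

# Transformation: daily revenue per store, with NULL amounts treated as zero.
daily = con.sql("""
    SELECT sale_date, store, SUM(COALESCE(amount, 0)) AS revenue
    FROM sales GROUP BY 1, 2 ORDER BY 1, 2
""").df()

# Quality test: revenue should never be negative.
assert (daily["revenue"] >= 0).all(), "negative revenue detected"
print(daily)
```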
- Bachelor's or Master's degree in computer science, statistics, engineering, or a related field.
- At least 3 years of experience in data analysis and modeling.
- Proficiency in Python, and SQL.
- Experience with data visualization tools such as Tableau, Grafana or similar.
- Familiarity with cloud computing platforms, such as GCP, AWS or Databricks.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- This role offers a clear path to advance into machine learning and AI with data quality and management, providing opportunities to work on innovative projects and develop new skills in these exciting fields.
- Contact: [email protected] (K.Thipwimon).
- You can read and review the privacy policy of Krungthai Bank Public Company Limited at https://krungthai.com/th/content/privacy-policy. The Bank has no intention or need to process sensitive personal data, including data relating to religion and/or blood type, which may appear on a copy of your national ID card. Please therefore do not upload any documents, including copies of your national ID card, or enter sensitive personal data or any other data that is not relevant or necessary for the purpose of the job application on the website. In addition, please make sure that you have removed sensitive personal data (if any) from your resume and any other documents before uploading them to the website. The Bank does need to collect personal data about your criminal record in order to assess candidates for employment, or to verify qualifications, disqualifying attributes, or suitability for the position; consent to the collection, use, or disclosure of your criminal record data is necessary for entering into a contract and for consideration under the purposes above. If you do not consent to the collection, use, or disclosure of your criminal record data, or later withdraw that consent, the Bank may be unable to proceed with the purposes above, and you may lose the opportunity to be considered for employment with the Bank.
Skills:
ETL, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business preferred).
- Expert experience with query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
3 years required
Skills:
Big Data, ETL, SQL, Python
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training (a brief sketch follows this list).
- Conduct meetings with users to understand data requirements, and perform database design based on data understanding and requirements, with consideration for performance.
- Maintain the data dictionary, relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
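A hedged sketch of the first duty above: turning raw records into a de-duplicated JSONL corpus of the shape commonly used for LLM training. The record schema, the length filter, and the file path are assumptions:

```python
import json

MIN_CHARS = 200  # assumed minimum document length worth keeping

def to_training_records(raw_rows):
    """Yield cleaned, de-duplicated {"text": ...} records."""
    seen = set()
    for row in raw_rows:
        text = (row.get("body") or "").strip()
        if len(text) < MIN_CHARS or text in seen:
            continue
        seen.add(text)
        yield {"text": text, "source": row.get("source", "unknown")}

raw_rows = [{"body": "example document " * 20, "source": "wiki"}]
with open("train.jsonl", "w", encoding="utf-8") as f:
    for rec in to_training_records(raw_rows):
        f.write(json.dumps(rec, ensure_ascii=False) + "\n")
```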
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business preferred).
- Expert experience with query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
No experience required
Skills:
Mechanical Engineering, Electrical Engineering, English
Job type:
Full-time
- Provide day-to-day installation, maintenance, and repair of all facilities in the data center.
- 24x7 shift work responsibility when qualified and designated.
- Provide requested reporting and documentation.
- Support of facility, development, and construction teams.
- Perform tasks as assigned by DC operation manager.
- Respond to customer requests, power, cooling, and facility audits.
- First tier investigate any power, communication, or cooling anomalies.
- Attend assigned meetings and training.
- Assist in ensuring customer compliance with GSA Acceptance Usage Policy (AUP).
- Provide technical escort when needed.
- Job Qualifications.
- Must be familiar with safety requirements and OSHA regulations or Thailand safety regulations.
- Basic understanding of electrical and mechanical systems that may be employed in a data center environment. This may include electrical feeders, transformers, generators, switchgear, UPS systems, DC power systems, ATS/STS units, PDU units, air handling units, cooling towers, and fire suppression systems.
- Able to interpret wiring diagrams, schematics, and electrical drawings.
- Ability to express ideas clearly, concisely, and effectively with contractors performing maintenance or upgrades on systems installed in the data center environment.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to analyze and make suggestions for problem resolution.
- Solve problems with good initiative and sound judgment.
- Creativity, problem solving skills, negotiation and systematic thinking.
- Fluent in English both written and verbal (Minimum 500 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience, and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development & programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos, and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements (a short example follows this list).
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
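A minimal sketch of enforcing a data-retention policy, one of the duties above, using SQLite as a stand-in for the real warehouse; the staging_events table and the 90-day window are assumptions:

```python
import sqlite3

RETENTION_DAYS = 90

con = sqlite3.connect(":memory:")  # stand-in for the actual warehouse
con.execute("CREATE TABLE staging_events (id INTEGER, ingested_at TEXT)")
con.execute(
    "INSERT INTO staging_events VALUES (1, '2020-01-01'), (2, date('now'))"
)

# Purge anything older than the retention window.
con.execute(
    "DELETE FROM staging_events WHERE ingested_at < date('now', ?)",
    (f"-{RETENTION_DAYS} days",),
)
con.commit()
print("rows kept:",
      con.execute("SELECT COUNT(*) FROM staging_events").fetchone()[0])
```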
- 6+ years of proven working experience in ETL methodologies, data integration, and data migration. Informatica IICS, Databricks/Spark & Python hands-on development skills are a must.
- Clear hands-on experience with database systems - SQL Server, Oracle, Azure Synapse, Snowflake and Cosmos, cloud technologies (e.g., AWS, Azure, Google), and NoSQL databases (e.g., Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks, and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong Knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge & working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost-effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis (a short example follows this list).
- Ensure adherence to the highest standards in data privacy protection and data governance.
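A small sketch of the automated data retrieval mentioned above, using pandas over SQLite as a stand-in for the production database; the orders table and the query are illustrative:

```python
import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")  # stand-in for the production database
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 120.0), ("south", 80.0), ("north", 50.0)],
)

# One reusable, scripted extract instead of repeated manual pulls.
df = pd.read_sql_query(
    "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region", con
)
print(df)
```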
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, and near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good in communication and writing in English.
- Good interpersonal and communication skills.
Skills:
Software Development, PHP, Golang, English
Job type:
Full-time
Salary:
negotiable
- Lead the design and implementation of high-quality software applications, ensuring best practices are followed.
- Collaborate with cross-functional teams to define, design, and deliver new features and enhancements.
- Mentor and guide junior engineers, fostering their technical development and growth.
- Conduct thorough code reviews to maintain high coding standards and ensure overall code quality.
- Optimize application performance and scalability, identifying opportunities for improvement.
- Design system architecture with a focus on security and adherence to programming standards.
- Solve complex technical challenges and provide strategic, scalable solutions.
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 3+ years of experience in software development.
- A Master's degree or additional certifications in relevant areas are a plus.
- Programming Language Proficiency: Strong expertise in PHP, Golang, NodeJS, and TypeScript.
- Experience with Programming Frameworks: Proficient in Go-Fiber, Go-Gin, ReactJS, NextJS, AngularJS, Laravel, and CodeIgniter.
- Database Experience: Hands-on experience with databases such as MongoDB, MariaDB, MySQL, and PostgreSQL.
- Strong understanding of data structures and algorithms.
- Expertise in system architecture design and development.
- In-depth knowledge of security programming standards and best practices.
- Advanced technical problem-solving abilities, with a proven ability to address complex issues.
- Possesses a positive attitude and participates in team-building and events.
- Comfortable presenting technical information and project updates to both technical and non-technical stakeholders.
- Skilled in using AI to solve complex problems, leading to improved outcomes.
- Be able to communicate in both Thai and English.
- Experience with reactive programming techniques and frameworks.
- Knowledge of cloud computing environments and microservices architecture design and implementation.
- Familiarity with DevOps practices and tools, including continuous integration and deployment processes.
- Remark: Given the nature of the mentioned position, where employees are involved with customer data and asset values, and/or the company, to comply with legal and regulatory standards established by the Securities and Exchange Commission, as well as to align with laws and overseeing agencies, the company requires a criminal background check as part of the post-interview process before joining the company. Your criminal history information will be retained for a period of 6 months from the start date.
- Important: https://careers.bitkub.com/privacy.
Skills:
Power BI, SQL, Python, English
Job type:
Full-time
Salary:
negotiable
- Understand and document the business requirements for developing effective data solutions.
- Develop dashboards, reports, and analyses that provide actionable insights.
- Balance retail/wholesale business acumen with data management expertise and technical proficiency.
- Collaborate closely with data engineers, data scientists, and business partners.
- Deliver robust, scalable data solutions with a focus on speed, performance, security, governance, and architecture.
- Mentor junior data analysts and promote best practices within the team.
- Requirements: 5+ years of experience as a data analyst, preferably in the retail or wholesale industries.
- Proven expertise in Power BI, SQL, Python and Cloud Data Platforms.
- Demonstrated ability to translate complex business requirements into effective data solutions.
- Strong leadership skills, including: Mentoring and developing junior analysts.
- Managing cross-functional projects.
- Guiding teams through complex BI development initiatives.
- Excellent communication skills, including: Fluency in verbal and written English.
- Ability to clearly articulate technical concepts to stakeholders with varying technical backgrounds.
- Track record of successful collaboration with various departments and stakeholders.
Experience:
3 years required
Skills:
English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for air condition & Fire protection systems.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them (a short example follows this list).
- Measure air-system efficiency and record it in the annual report.
- Responsible for implementation of Mechanical such as comfort air, precision air.
- Responsible for implementation of Fire suppression such as FM200, NOVEC, CO2, Fire Sprinkler, Fire Pump, Fire alarm, and Vesda.
- Working hours are office hours (9:00-18:00), with the ability to stand by on call or on-site on holidays.
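For reference, the PUE figure reviewed above is total facility energy divided by IT equipment energy; the kWh readings below are invented for illustration:

```python
total_facility_kwh = 1_450_000.0  # utility meter reading (assumed)
it_equipment_kwh = 1_000_000.0    # IT load metered at the PDUs (assumed)

pue = total_facility_kwh / it_equipment_kwh
print(f"PUE = {pue:.2f}")  # 1.45 here; closer to 1.0 means less overhead
```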
- Bachelor's degree in Mechanical Engineering or a related field.
- At least 3 years' experience in air-conditioning maintenance (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementing and supporting mechanical air-conditioning systems in buildings or data centers.
- At least 1 year's experience in designing air-conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementing and supporting mechanical air-conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to reach the site from home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
ISO 27001, English
Job type:
Full-time
Salary:
negotiable
- Responsible for monitoring, controlling, and managing the facility infrastructure - electrical, air-conditioning, and network systems - to support operations.
- Respond to customer requests and coordinate vendor installations and fixes so that the work is correct and complete according to standard practice.
- Control and coordinate preventive maintenance and repairs of the facility systems, e.g., generators, UPS units, electrical switchboards, air-conditioning, and installation of network equipment.
- Act as 1st-level support & troubleshooting for data center facility systems such as network, electrical, and air-conditioning systems.
- Prepare operating procedures and work manuals for the facility systems based on ISO and other relevant operational standards (e.g., ISO 20000 for service, ISO 27001 for security, ISO 50001 for energy management, and others such as ISO 22301, PCI DSS, TCOS), including record forms and reports.
- Summarize and report any critical problems to the team lead, and prepare statistical and analytical reports on a daily, monthly, and quarterly basis.
- Bachelor's degree in electrical power, mechanical engineering, or related fields.
- Thai nationality, male, age 20-25 years old.
- Have basic technical knowledge in Data Center facilities (Electrical/Mechanical).
- Able to work under pressure.
- Able to work with a team.
- Fair communication in English.
Experience:
5 years required
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers to preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air-system efficiency and record it in the annual report.
- Responsible for implementing electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS.
- Bachelor's degree in Electrical Engineering or a related field.
- More than 5 years of experience in maintenance of electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS: implementing and supporting electrical systems in buildings or data centers.
- At least 1 year's experience in designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS): implementing and supporting electrical systems in buildings.
- Able to use AutoCAD and Visio.
- Able to work as a team and stand by on call on holidays.
- Able to work overtime if required and respond to hotline calls (less than 1 hour to reach the site from home).
- Proficiency in English communication is beneficial for both reading and writing.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium, complex data sets that meet functional /non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation, and role-based access (a brief sketch follows this list).
- Create data tools for analytics and data scientist team members.
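A hedged sketch of the obfuscation duty above: replacing a direct identifier with a salted hash in PySpark so joins remain possible without exposing the raw value. The column names and the salt are assumptions:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("obfuscate").getOrCreate()
df = spark.createDataFrame([("alice@example.com", 120.0)], ["email", "spend"])

# Swap the raw identifier for a salted SHA-256 digest: joins on the hash
# still work, but the original value is not exposed to analysts.
masked = df.withColumn(
    "email_hash",
    F.sha2(F.concat(F.lit("static-salt-"), F.col("email")), 256),
).drop("email")
masked.show(truncate=False)
```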
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In depth knowledge of data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems - Hadoop ecosystem, cloud technologies (e.g. AWS, Azure, Google), in-memory database systems (e.g. HANA, Hazelcast), traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools - traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
- Educational.
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
Experience:
3 years required
Job type:
Full-time
Salary:
negotiable
- We are seeking a dedicated Fixed Broadband Technical Support Engineer to join our team. This role is crucial in providing exceptional technical support and troubleshooting for our fixed broadband services. The ideal candidate will possess a strong understanding of network technologies, excellent customer service skills, and the ability to resolve technical issues efficiently.
- Provide 2nd Tier technical support to customers experiencing issues with fixed broadband Network & services.
- Diagnose and troubleshoot connectivity issues, including hardware, software, and network configuration problems.
- Monitor, analyze, and report network/service performance metrics, proactively identifying potential issues.
- Collaborate with engineering and product teams to resolve complex technical problems and provide feedback for service improvements.
- Maintain detailed documentation of customer interactions, technical issues, and resolutions in the ticketing system.
- Stay updated on new technologies, products, and industry trends to enhance service quality and effectiveness.
- 24x7 standby shifts to support first-line staff.
- 0-3 years' experience in Telecommunications or IT Operations.
- Bachelor's degree in Computer Science, Information Technology, Telecommunications, or a related field (or equivalent experience).
- Excellent communication skills, both verbal and written, with an ability to explain technical concepts to non-technical users.
- Customer-oriented mindset with strong problem-solving skills.
- Ability to work independently and as part of a team in a fast-paced environment.
- Basic computer skills (Excel, Word, PowerPoint, etc.) are essential.
- Experience with various broadband technologies (FTTx, MPLS, VLL, VPN, routing protocols) and associated equipment.
- Relevant certifications (e.g., CompTIA Network+, Cisco Certified Network Associate) are advantageous.
- Programming skills in Python, shell script, or PHP are advantageous.
- We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferrable skills, and we can support you with the skills that you need to develop.
- Encouraging a diverse and inclusive organization is core to our values at Ericsson, that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity and Affirmative Action employer, learn more.
- We are committed to providing reasonable accommodations to all individuals participating in the application and interview process. If you need assistance or to request an accommodation due to a disabilityplease reach out to Contact Us.
- We are proud to announce that Ericsson Thailand has again been officially Great Place to Work Certified in 2023. Every year, more than 10,000 organizations from over 60 countries partner with the Great Place to Work Institute for assessment, benchmarking, and planning actions to strengthen their workplace culture, and this certification acknowledges that our employees value their employee experience and our workplace culture.
- Primary country and city: Thailand (TH) || Bangkok.
- Job details: Automated Operations Engineer.
- Primary Recruiter: Sitthinon Charoenkitwayo.
Experience:
3 years required
Skills:
Kubernetes, Automation, Redis
Job type:
Full-time
Salary:
negotiable
- Platform Operations: Manage and operate our Kubernetes platform, ensuring high availability, performance, and security.
- Automation & Tooling: Design, develop, and implement automation solutions for operational tasks, infrastructure provisioning, and application deployment (a short example follows this list).
- Observability: Build and maintain a comprehensive observability stack (monitoring, logging, tracing) to proactively identify and resolve issues.
- Platform Stability & Performance: Implement and maintain proactive measures to ensure platform stability, performance optimization, and capacity planning.
- Middleware Expertise: Provide support and expertise for critical middleware tools such as RabbitMQ, Redis, and Kafka, ensuring their optimal performance and reliability.
- Incident Response: Participate in our on-call rotation, troubleshoot and resolve production incidents efficiently, and implement preventative measures.
- Collaboration: Collaborate effectively with development and other engineering teams.
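A minimal automation sketch in the spirit of the Automation & Tooling duty above: listing unhealthy pods with the official Kubernetes Python client. It assumes a reachable cluster and a local kubeconfig:

```python
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

# Flag any pod that is neither Running nor Succeeded.
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```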
- Positive attitude and empathy for others.
- Passion for developing and maintaining reliable, scalable infrastructure.
- A minimum of 3 years working experience in relevant areas.
- Experience in managing and operating Kubernetes in a production environment.
- Experienced with cloud platforms like AWS or GCP.
- Experienced with high availability, high-scale, and performance systems.
- Understanding of cloud-native architectures.
- Experienced with DevSecOps practices.
- Strong scripting and automation skills using languages like Python, Bash, or Go.
- Proven experience in building and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Deep understanding of monitoring, logging, and tracing tools and techniques.
- Experience with infrastructure-as-code tools (e.g., Terraform, Ansible).
- Strong understanding of Linux systems administration and networking concepts.
- Experience working with middleware technologies like RabbitMQ, Redis, and Kafka.
- Excellent problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Strong interest and ability to learn any new technical topic.
- Experience with container security best practices.
- Experience with chaos engineering principles and practices.
- Experience in the Financial Services industry.
- Opportunity to tackle challenging projects in a dynamic environment.
Experience:
6 years required
Skills:
Compliance, Legal
Job type:
Full-time
Salary:
negotiable
- Lead the team to handle all quality excursions independently, and take effective actions on time.
- Oversee the product & material quality from NPI to mass production.
- Develop the quality control plan for respective area (material/process/product).
- Monitor and report quality KPI for internal (factory) and external (customer).
- Drive continuous improvement to benefit customer, CLS and supplier.
- Develop and maintain internal quality system, procedures, work instructions and workmanship standards.
- Lead process/product/system/supplier audits and follow up on improvement actions, covering: Industry standards (e.g. ISO).
- Compliance audits.
- Safety audits, etc.
- Follow up on ECs (Engineering Changes) and SPCNs (Supplier Process Change Notifications) to ensure changes are implemented in a timely and accurate manner (whether initiated externally or internally).
- Coach junior quality staff to improve their quality knowledge.
- Accomplish job assignments from the superior and participate in the quality strategy deployment.
- Knowledge/Skills/Competencies.
- Strong knowledge of quality tools, ISO and IPC standards and processes.
- Knowledge of software and its uses in generating reports, capturing data, and presenting data in an understandable format.
- Strong knowledge of product and manufacturing processes.
- Knowledge and understanding of the business unit and how decisions impact customer satisfaction, product quality, on-time delivery, and profitability of the unit.
- Knowledge of quality tools such as FMEA, PMP, SPC, 8D methodology, etc. (a short example follows this list).
- Knowledge of Six Sigma and Lean Kaizen.
- Ability to effectively communicate with a wide variety of internal and external customers.
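A short illustration of one quality tool named above, SPC: individuals-chart control limits computed from sample measurements. The readings are invented; the d2 = 1.128 moving-range constant is the standard value for subgroup size 2:

```python
import statistics

readings = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]

center = statistics.mean(readings)
# Estimate sigma from the average moving range, the usual I-chart basis.
moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
sigma = statistics.mean(moving_ranges) / 1.128

ucl, lcl = center + 3 * sigma, center - 3 * sigma
print(f"center={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
print("out-of-control:", [x for x in readings if not lcl <= x <= ucl])
```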
- Physical Demands.
- Duties of this position are performed in a normal office environment.
- Duties may require repetitive manual movements (e.g., keyboarding), carrying, pushing or pulling light objects, (under 5 kg.), carrying, pushing or pulling heavy objects (over 5 kg.), crouching, climbing.
- Sustained visual concentration on small areas, such as monitors, screens, precise eye/hand coordination, sustained visual concentration on numbers, legal documents.
- Typical Experience.
- 4 to 6 years in a similar role or industry.
- Typical Education.
- Bachelor's degree in a related field, or consideration of an equivalent combination of education and experience.
- Educational requirements may vary by geography.
- Notes.
- This job description is not intended to be an exhaustive list of all duties and responsibilities of the position. Employees are held accountable for all duties of the job. Job duties and the % of time identified for any function are subject to change at any time.
- Celestica is an equal opportunity employer. All qualified applicants will receive consideration for employment and will not be discriminated against on any protected status (including race, religion, national origin, gender, sexual orientation, age, marital status, veteran or disability status or other characteristics protected by law).
- At Celestica we are committed to fostering an inclusive, accessible environment, where all employees and customers feel valued, respected and supported. Special arrangements can be made for candidates who need it throughout the hiring process. Please indicate your needs and we will work with you to meet them.
- Celestica (NYSE, TSX: CLS) enables the world's best brands. Through our recognized customer-centric approach, we partner with leading companies in Aerospace and Defense, Communications, Enterprise, HealthTech, Industrial, Capital Equipment and Energy to deliver solutions for their most complex challenges. As a leader in design, manufacturing, hardware platform and supply chain solutions, Celestica brings global expertise and insight at every stage of product development - from drawing board to full-scale production and after-market services for products from advanced medical devices, to highly engineered aviation systems, to next-generation hardware platform solutions for the Cloud. Headquartered in Toronto, with talented teams spanning 40+ locations in 13 countries across the Americas, Europe and Asia, we imagine, develop and deliver a better future with our customers.
- Celestica would like to thank all applicants, however, only qualified applicants will be contacted.
- Celestica does not accept unsolicited resumes from recruitment agencies or fee based recruitment services.