1 - 9 of 9 job positions for keyword Apache
Skills:
Cloud Computing, SAP, Linux
Job type:
Full-time
Salary:
negotiable
- Act as the key technical contact on Cloud matters for the consulting team, providing technical consulting to both internal and external customers.
- Design Cloud solution architecture in response to the client's requirements.
- Provide advisory consulting services to the client regarding the True IDC Consulting practices.
- Create Cloud technical requirements for the client's migration plan.
- Experience of designing and implementing comprehensive Cloud computing solutions on various Cloud technologies e.g. AWS, GCP.
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Experience in SAP Cloud infrastructure in terms of architecture & design on AWS and GCP public clouds.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and their architecture on the Cloud.
- Knowledge of containerization administration for both Windows and Linux technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Good at customer objection handling & good customer presentation skills.
- Nice to have:
- UNIX shell scripting.
- AWS Certified Solution Architect - Associate.
- GCP Certified Solution Architect - Associate.
6 days ago
Skills:
SQL, Data Warehousing, ETL, English
Job type:
Full-time
Salary:
negotiable
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Retrieve, prepare, and process a rich variety of data sources.
- Apply data quality, cleaning and semantic inference techniques to maintain high data quality.
- Explore data sources to better understand the availability and quality/integrity of data.
- Gain fluency in AI/ML techniques.
- Experience with relational database systems with expertise in SQL.
- Experience in data management, data warehousing or unstructured data environments.
- Experience with data integration or ETL management tools such as Talend, Apache Airflow, AWS Glue, Google DataFlow or similar.
- Experience programming in Python, Java or other equivalent is a plus.
- Experience with Business Intelligence tools and platforms is a plus, e.g. Tableau, QlikView, Google DataStudio, Google Analytics or similar.
- Experience with Agile methodology and Extreme Programming is a plus.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Good communication in English.
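The data-quality and cleaning work described above can be sketched in plain Python. This is a minimal, stdlib-only illustration of the "retrieve, prepare, clean" pattern that tools like Talend or Airflow orchestrate at scale; the field names and cleaning rules here are invented for the example:

```python
def clean_records(rows):
    """Drop incomplete rows, normalise text fields, and coerce types."""
    cleaned = []
    for row in rows:
        # Hypothetical data-quality rule: a record needs an id and a numeric salary.
        if not row.get("id") or row.get("salary") in (None, "", "negotiable"):
            continue
        cleaned.append({
            "id": int(row["id"]),
            "title": row["title"].strip().title(),
            "salary": float(row["salary"]),
        })
    return cleaned

raw = [
    {"id": "1", "title": "  data engineer ", "salary": "52000"},
    {"id": "2", "title": "bi developer", "salary": "negotiable"},  # dropped: no numeric salary
    {"id": "",  "title": "intern", "salary": "9000"},              # dropped: missing id
]
print(clean_records(raw))  # one clean record survives
```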
1 day ago
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
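The pipeline responsibilities above (ingest, process, store) follow the classic extract-transform-load shape. A minimal sketch, with sqlite3 standing in for a cloud warehouse; the schema and source data are invented:

```python
import sqlite3

def extract():
    # In a real pipeline this would read from object storage, an API, a queue, etc.
    return [("2024-01-01", "th", 120), ("2024-01-01", "sg", 80),
            ("2024-01-02", "th", 95)]

def transform(rows):
    # Aggregate event counts per day: a typical batch-processing step.
    totals = {}
    for day, _region, count in rows:
        totals[day] = totals.get(day, 0) + count
    return sorted(totals.items())

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS daily_events (day TEXT, total INT)")
    conn.executemany("INSERT INTO daily_events VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
print(conn.execute("SELECT * FROM daily_events ORDER BY day").fetchall())
```

In PySpark the same shape appears as read, transform, write stages over DataFrames, with the framework handling distribution.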
2 days ago
Skills:
DevOps, Automation, Kubernetes
Job type:
Full-time
Salary:
negotiable
- Manage 7-8 Professional Service Engineers responsible for AWS cloud solution architecting and implementation/migration according to project requirements.
- Team resources management.
- Act as the key technical contact on Cloud matters for the consulting team, providing AWS cloud technical consulting to customers.
- Design AWS Cloud solution architecture in response to the client's requirements.
- Define the scope of work & estimate mandays for cloud implementation.
- Managing cloud project delivery to meet the customer requirements timeline.
- Support AWS, GCP cloud partner competency building e.g. AWS Certification and delivery professional service process and documentation.
- Serve as the AWS technical speaker for True IDC webinars and CloudTalk online events.
- Drive the expansion of team competency to meet the yearly competency roadmap strategy, e.g. DevOps, IaC, Automation, Kubernetes, and app modernization on AWS cloud.
- Experience in leading cloud AWS implementation and delivery team.
- Experience in designing and implementing comprehensive Cloud computing solutions on various Cloud technologies; AWS and GCP are a plus.
- Experience with infrastructure as code, whether cloud-native (CloudFormation) or other tools, e.g. Terraform or Ansible.
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and their architecture on the Cloud.
- Knowledge of OS administration for both Windows and UNIX technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Knowledge of Kubernetes, Containers and CI/CD, DevOps.
- Experience with RDBMS designing and implementing over the Cloud.
- Prior experience with application development on various development solutions such as Java, .NET, Python, etc.
- Experience in .NET and/or Spring Framework and RESTful web services.
- UNIX shell scripting.
- AWS Certified Solution Architect - Associate; Professional level preferred.
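The infrastructure-as-code experience this role asks for boils down to describing cloud resources declaratively instead of clicking them together. A hypothetical illustration, building a minimal CloudFormation template as a Python dict (the bucket name and logical ID are invented):

```python
import json

# A minimal CloudFormation template expressed as a Python dict.
# Declaring the S3 bucket here, rather than creating it by hand,
# is the core idea of infrastructure as code.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "AppBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-app-artifacts"},
        }
    },
}
print(json.dumps(template, indent=2))
```

Terraform and Ansible express the same idea in their own configuration languages; the deliverable in all cases is versionable text rather than manual console work.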
1 day ago
Bangkok, IT / Programming, Engineering, Senior Management
Experience:
5 years required
Skills:
Java, Problem Solving, Spring Boot, English
Job type:
Full-time
Salary:
negotiable
- Develop and maintain a suite of multiple software products.
- Each product in the suite uses various development tools (Java and C#.NET).
- Main products in the suite significantly use database programming (connect, complex query, updates, optimization).
- Optimize the applications for performance, stability, and scalability.
- Work as a team, flexible, proactive, well organized and focus on objectives with high standard and quality.
- Essential Skills/Experience Required: Minimum 5 years' software engineering experience.
- Strong programming skills in Java and C#.NET required.
- Strong database software programming (connect, complex query, updates).
- Strong problem solving and analytical skills.
- Good communication of spoken and written English.
- Nice to have good knowledge of one of these: Java, Spring Boot, Node.js, SQL, JavaScript, HTML/CSS, XSS, REST API, Git, Jenkins, Apache, Linux, Docker, Swagger.
- Nice to have experience in agile development life cycle, sprint review, sprint planning and code reviews.
- LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions.
Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity.
LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone's race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs.
Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) ("we") may hold about you, what it's used for, how it's obtained, your rights, and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
6 days ago
Skills:
Compliance, Power BI, Tableau, English
Job type:
Full-time
Salary:
negotiable
- Architect scalable BI solutions that meet strategic business needs within a modern cloud data platform environment.
- Oversee the development of cutting-edge BI systems, ensuring they align with organizational goals and industry best practices.
- Implement robust security measures to protect sensitive data and ensure compliance with relevant regulations.
- Optimize BI systems for cost-effectiveness, balancing expenses with performance and functionality.
- Fine-tune BI architectures to deliver high performance, enabling quick data access and analysis for thousands of users.
- Collaborate with cross-functional teams to understand and address diverse business intelligence requirements.
- Design and implement BI best practices, guidelines, and processes to drive self-service BI adoption across the company.
- Continuously evaluate and integrate new technologies, including machine learning and AI, to keep the BI infrastructure at the forefront of innovation.
- Extensive experience in designing and implementing cloud-based Business Intelligence (BI) systems, with expertise in Power BI or Tableau, within modern cloud data platform environments.
- Strong technical proficiency in cloud technologies, particularly Azure, AWS, or GCP, for building and managing scalable BI infrastructures.
- In-depth knowledge of data security practices and compliance requirements for enterprise-level BI systems, including cloud-specific security measures.
- Proven experience in cost optimization and management for large-scale cloud BI infrastructures, leveraging cloud-native tools and best practices.
- Expertise in big data technologies such as Apache Spark and familiarity with platforms like Databricks for handling and processing large datasets in BI contexts.
- Knowledge of machine learning and AI integration within BI solutions to drive data-driven insights.
- Strong understanding of data governance frameworks, data lineage, and metadata management.
- Expertise in implementing data quality and master data management (MDM) solutions.
- Excellent leadership and project management skills, with the ability to oversee complex BI development projects and guide cross-functional teams.
- Strong collaboration and communication skills, including fluency in both verbal and written English, to effectively work with diverse stakeholders and articulate technical concepts clearly.
- A track record of staying current with emerging BI technologies and successfully integrating them into existing architectures, while developing and implementing best practices for self-service BI adoption.
16 days ago
Experience:
6 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- Collate technical and functional requirements through workshops with senior stakeholders in risk, actuarial, pricing and product teams.
- Translate business requirements to technical solutions leveraging strong business acumen.
- Analyse current business practice, processes, and procedures as well as identifying future business opportunities for leveraging Data & Analytics solutions on various platforms.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Provide architectural expertise to sales, project and other analytics teams.
- Identify risks, assumptions, and develop pricing estimates for the Data & Analytics solutions.
- Provide solution oversight to delivery architects and teams.
- Skills and attributes for success.
- 6-8 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to estimate complexity, effort and cost.
- Ability to produce client-ready solution architecture and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages and tools such as Spark, Scala, Impala, Hive SQL, Apache NiFi and Kafka, necessary to build and maintain complex queries and streaming and real-time data pipelines.
- Data modelling and architecting skills including strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP, or data vault.
- Good fundamentals around security integration including Kerberos authentication, SAML and data security and privacy such as data masking and tokenisation techniques.
- Good knowledge in DevOps engineering using Continuous Integration/ Delivery tools.
- An in-depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experience integrating them into traditional hosting/delivery models.
- Ideally, you'll also have:
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
11 days ago
Skills:
ETL, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Conduct meetings with users to understand the data requirements, and perform database design based on data understanding and requirements with consideration for performance.
- Maintain the data dictionary, relationships and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Master's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems or an IT-related field.
- 3+ years' experience in Data Management or as a Data Engineer (Retail or E-Commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, PostgreSQL.
- Experience in Big Data technologies like Hadoop, Apache Spark, Databricks.
- Experience in Python is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Experience in Generative AI is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
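The "batch processing and near real-time streaming" distinction in the responsibilities above often comes down to window size: streaming systems such as Spark Structured Streaming process small micro-batches of an unbounded stream instead of one large batch. A toy sketch of the micro-batching idea, with an invented integer event stream:

```python
def micro_batches(events, batch_size):
    """Group a (potentially unbounded) event stream into small batches."""
    buf = []
    for event in events:
        buf.append(event)
        if len(buf) == batch_size:
            yield list(buf)  # emit a full micro-batch
            buf.clear()
    if buf:
        yield list(buf)  # flush the final partial batch

# Each micro-batch is processed (here, summed) as soon as it is full,
# rather than waiting for the whole stream to finish.
stream = iter(range(7))
print([sum(batch) for batch in micro_batches(stream, 3)])
```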
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Today
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand the data requirements, and perform database design based on data understanding and requirements with consideration for performance.
- Maintain the data dictionary, relationships and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems or an IT-related field.
- 3+ years' experience in Data Management or as a Data Engineer (Retail or E-Commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, PostgreSQL.
- Experience in Big Data technologies like Hadoop, Apache Spark, Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
6 days ago