Skills:
SQL, NoSQL
Job type:
Full-time
Salary:
negotiable
- Shape the future of robust and scalable backend systems that power critical financial applications on a multi-dealer FX platform.
- You will work as a business analyst (BA) on either the buy-side, core, or sell-side platform teams.
- Utilise your domain knowledge to write requirements for the development team you work with.
- Build positive relationships with the Product team to understand the challenges we are trying to solve, and provide effective solutions.
- Provide requirement elaboration and participate in issue triages.
- Qualifications/Requirements: Proven experience and an excellent understanding of trading systems for FX products (spot, forwards, swaps and options), including non-vanilla products.
- Experience with market data, trade pricing, etc.
- Good understanding of the FIX protocol.
- Understanding of post-trade processes and pre-trade checks.
- Ability to converse fluently with Senior Product Management on the above topics.
- Technology: Understanding of cloud-native application architectures.
- Experience with database design and management (SQL and NoSQL databases).
- Process: Understands the Agile process and the metrics used to monitor the team.
- Excellent at recognising potential improvements in processes.
- Skills Required: Proactive self-starter, able to tackle high-level tasks with minimal direction, and to formulate, refine and propose new insights.
- Can dive into the detail when needed to tackle problems and can also ask the right questions to complete the picture.
- Must have excellent verbal and written communication skills, including the ability to present sophisticated subjects clearly and to make them accessible to others.
- Produce detailed, unambiguous user stories so that developers and QA are working with the same understanding.
- Ability to clearly articulate non-functional requirements e.g. response time, throughput, resilience; experience with low latency, high availability platforms would be helpful.
- Partnering with global teams, including Product, Architecture and Development, is key to this role.
- You will use Agile concepts such as backlog refinement, MVP and feature increments to help lead the flow of work and to manage expectations.
- Education: Bachelor's or Master's degree.
- Benefits: Work with a world-class, global and diverse engineering team located in Singapore, Bangkok, London, New York and Bucharest.
- We are a diverse and inclusive organization.
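For readers unfamiliar with the FIX protocol mentioned in the requirements above, a minimal sketch of how a FIX message is framed and checksummed may help. Tag numbers follow the public FIX 4.4 specification (8=BeginString, 9=BodyLength, 35=MsgType, 10=CheckSum); the field values and helper names here are illustrative only.

```python
SOH = "\x01"  # the FIX field delimiter

def fix_checksum(msg: str) -> str:
    # Sum of every byte up to (not including) the CheckSum field,
    # modulo 256, rendered as exactly three digits.
    return f"{sum(msg.encode('ascii')) % 256:03d}"

def build_fix_message(msg_type, fields):
    # Body starts with MsgType (35); BodyLength (9) counts the body's bytes.
    body = "".join(f"{tag}={val}{SOH}" for tag, val in [(35, msg_type)] + fields)
    header = f"8=FIX.4.4{SOH}9={len(body)}{SOH}"
    msg = header + body
    return msg + f"10={fix_checksum(msg)}{SOH}"

# A hypothetical spot-FX NewOrderSingle (35=D): Symbol is tag 55, Side is tag 54.
order = build_fix_message("D", [(55, "EUR/USD"), (54, "1")])
```

Real FIX engines add many more required header fields (sequence numbers, sender/target IDs, timestamps); this only shows the framing and checksum mechanics.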
- LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone's race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs.
Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it's used for and how it's obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
Experience:
5 years required
Skills:
Java, MongoDB, NoSQL, English
Job type:
Full-time
Salary:
negotiable
- You have a good understanding of Object-Oriented Programming concepts.
- You have experience working on Distributed Systems / Microservices.
- You should be able to produce clean, efficient code based on specifications.
- You will be working with an existing product.
- Knowledge of and adherence to best practices for writing maintainable code and unit-testing is a must.
- You possess analytical and problem-solving skills.
- You should be able to work independently as a contributing member in a high-paced and focused team.
- Bachelor's Degree in Computer Science or Information Technology, or equivalent experience.
- At least five years of experience writing programs in C# (.NET Framework) or Java.
- At least two years of experience working with MongoDB or other NoSQL databases.
- Able to learn and understand various API services.
- Self-motivated, eager to solve problems, driven to completion, and willing to work with others. We encourage pair programming and require collaboration on design, code reviews, and testing.
- A hybrid engineer, capable of designing and implementing your own code as well as reviewing, testing, and writing test automation for other engineers' code.
- Enjoys exploring new technologies and programming techniques, with a "willing to learn" attitude.
- Fluent in written and spoken English.
- This role is open to both Thai and non-Thai candidates. We can provide full visa sponsorship if required.
Skills:
Cloud Computing, SAP, Linux
Job type:
Full-time
Salary:
negotiable
- Acting as the key cloud technical resource for the consulting team, providing technical consulting to both internal and external customers.
- Design cloud solution architectures in response to the client's requirements.
- Provide advisory consulting services to the client regarding True IDC consulting practices.
- Create cloud technical requirements for the client's migration plan.
- Experience of designing and implementing comprehensive Cloud computing solutions on various Cloud technologies e.g. AWS, GCP.
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Experience in SAP cloud infrastructure, in terms of architecture and design on the AWS and GCP public clouds.
- Knowledge of Linux, Windows, Apache, IIS and NoSQL operations, and how their architectures map to the cloud.
- Knowledge of container administration for both Windows and Linux technologies.
- Knowledge of key concerns in cloud computing, such as security, performance and scalability, and how they are addressed.
- Good at handling customer objections, with strong customer presentation skills.
- Nice to have.
- UNIX shell scripting.
- AWS Certified Solutions Architect - Associate.
- GCP Certified Solution Architect - Associate.
Skills:
Scrum, Python, Amazon AWS
Job type:
Full-time
Salary:
negotiable
- Collaborate with Product Managers, UX team, and Software Engineers around the globe to deliver outstanding products.
- Through participation in refinement and planning sessions, you'll work with other team members to analyze development requirements, provide design options and complexity estimates, and agree how to deliver the requirements.
- Develop and maintain enterprise software, adhering to company standards and established software methodology.
- Identify and resolve performance and stability issues.
- Mentor junior engineers on good software development principles.
- Required Skills & Experience: 4+ years as a Software Developer.
- TypeScript, Node.js.
- API design (REST, GraphQL).
- Strong technical background with understanding of programming styles, frameworks, design patterns and unit testing.
- Practical experience with cloud-native application development with one of the major cloud providers (AWS, GCP, Azure).
- Excellent problem-solving and communication skills.
- Desired Skills: Experience in crafting scalable, high-performance NoSQL databases.
- Experience in Infrastructure as Code (Terraform) and AWS/Azure serverless technology.
- Knowledge of basic security concepts such as authentication, authorization, OIDC, OAuth.
- LSEG is an equal opportunities employer that seeks to offer an inclusive environment to all colleagues. Furthermore, LSEG has committed to reduce its carbon emissions by 50% by 2030, and to reach net zero by 2040.
Skills:
SQL, Research, Java
Job type:
Full-time
Salary:
negotiable
- Background in SQL, databases and/or data science OR.
- BS/MS in software engineering, computer science, mathematics.
- Document data sources in enterprise data catalog with metadata, lineage and classification information.
- Develop aggregations and algorithms needed for reporting and analytics, with low complexity.
- Implement minor changes to existing data visualization applications, reporting dashboards.
- Document modifications to reporting applications based on modifications applied.
- Comprehend and adhere to all data security policies and procedures.
- Create data tools for analytics and data scientist team members.
- Build analytical tools to provide actionable insights into key business KPIs, etc.
- Work with data engineers to optimize pipelines for scalability and data delivery.
- Functional Competency.
- Working knowledge with data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Experience with data tools for visualizations, analytics and reporting.
- Strong analytical skills with ability to research, assess and develop observations/findings.
- Ability to communicate findings, approaches to cross functional teams and stakeholders.
- 3+ years' hands-on experience with a data science background.
- Some programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: cloud technologies (e.g. AWS, Azure, Google Cloud), in-memory database systems (e.g. HANA, Hazelcast), traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
Experience:
5 years required
Skills:
Java, TypeScript, Javascript, English
Job type:
Full-time
Salary:
negotiable
- Planning and delivering proactive and reactive support including onsite presence as needed (post Covid restrictions).
- You will work with a larger customer account team to strengthen customer relationships and to work on Microsoft cloud and application innovation strategies that allow you to develop an immediate and long-term Customer Success Plan and Value Based Delivery for reactive and proactive needs.
- You will Identify and manage customer goals and SfMC opportunities across Azure PaaS ...
- You will drive and participate in proactive delivery management as well as spot performance issues, analyze problems, and drive activities focused on stabilizing and optimizing your customer s solution.
- You will work with internal Microsoft support teams, account teams, product engineering and service engineering teams and other stakeholders to ensure a streamlined and efficient customer support experience.
- You will apply and share lessons learned for continuous process and delivery improvement for the customer and peers.
- You will engage in meetings with your customers and account teams to review Support for Mission Critical services, customer support issues, and articulate your Customer Success Plans.
- You will share and gain knowledge through technical communities.
- You will contribute to on-call rotations to ensure a high quality of service for the critical incidents created by Support for Mission Critical customers.
- Required Qualifications: 5+ years of technical engineering experience with coding in languages including, but not limited to, C#, Java, TypeScript, JavaScript or Python, OR equivalent experience.
- 5+ years' experience architecting and building large-scale, complex enterprise services on cloud platforms such as Azure or AWS.
- 5+ years of Azure PaaS-related experience, with breadth of technical experience and knowledge, and depth/subject-matter expertise in the following Azure PaaS areas:
- o Azure Container Services, including AKS, ACA, ACI etc.
- o Azure API Management
- o Azure App Service
- o Azure Function
- o Application Development
- Strong knowledge and experience in designing and implementing Azure PaaS solutions.
- Experience in designing, implementing, and shipping complex enterprise software products/services.
- Hands-on ability to write secure, reliable, and maintainable code and to test and debug.
- Familiar with Microsoft cloud adoption framework, Well-architected framework.
- Strong knowledge and experience in .NET Core, .NET Framework and C#.
- Strong knowledge and experience in DevSecOps and Site Reliability Engineering.
- Strong knowledge and experience in application monitoring, and familiarity with products/services like Application Insights, Prometheus, Grafana, etc.
- Strong knowledge and experience in microservice design.
- Strong knowledge and experience in container ecosystems, including Kubernetes, Docker, service mesh, etc.
- Proficiency in utilizing GitHub and/or Azure DevOps Service.
- Ability to operate and be successful in a highly ambiguous, rapidly evolving environment.
- Azure IaaS.
- Strong knowledge and experience in designing and implementing Azure storage solutions.
- Strong knowledge and experience in designing and implementing Azure networking solutions.
- Strong knowledge and experience in designing and implementing Azure compute solution.
- Strong knowledge in high availability and disaster recovery features for IaaS components.
- Familiar with Azure security services/features.
- Strong knowledge and experience in building infrastructure-as-code solutions using products like Azure Bicep, Azure ARM templates, Terraform, Ansible, etc.
- Language Qualifications: Thai: confident in reading and writing, with moderate speaking ability.
- English: fluent in reading, writing and speaking.
- Other Qualifications: Experience in systems management, network operations, software support, IT consulting, or related roles.
- Solid understanding of client/server, networking, and Internet technologies fundamentals.
- Must have outstanding customer service skills with excellent oral and written communication skills as well as experience providing training to peers or customers.
- Must demonstrate strong interpersonal and leadership skills while working with diverse audiences including highly technical IT professionals, engineers, developers, and architects as well as executives and management professionals in both customer and Microsoft teams.
- Must have experience leading and driving projects as well as motivating others.
- Must be self-motivated, resourceful and able to handle multiple responsibilities as a Microsoft Cloud Solution Architect and Support for Mission Critical professional.
- Need to demonstrate the ability to develop strong strategic customer relationships that gain the trust and respect of customers.
- Need the ability to handle critical technical issues and work in difficult support situations.
- Need a proven ability to handle difficult or sensitive situations with exasperated customers.
- Certification in Microsoft and other Cloud Technologies.
- The following are a plus: Knowledge and experience in multi-cloud platforms like AWS, GCP, etc.
- Knowledge and experience in Microsoft Copilot products, AI, and Machine Learning.
- Knowledge and experience in Azure relational databases and NoSQL database services.
- Knowledge and experience in open-source technology like LAMP.
- Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Skills:
Java, Spring Boot, Kubernetes
Job type:
Full-time
Salary:
negotiable
- Work in an agile team to build/develop features and technologies across various aspects of the Java stack, primarily focused on Spring Boot and Spring Cloud/NetflixOSS.
- CI/CD deployments on a Kubernetes-based platform, both on-premises and on multi-cloud infrastructure. (AWS and GCP).
- Possess an understanding of cloud-native architectures and be familiar with implementations involving service discovery, circuit breakers, client-side load balancing, and other architectural patterns related to elastic infrastructure.
- Participate in, and help create a company culture that attracts, retains, and coaches other engineers. The primary deliverable of a senior engineer is more senior engineers.
- Conduct design and code reviews.
- Provide specific technical expertise to help drive innovation.
- Identify emerging technologies to create leading-edge banking products.
- Partner with architects and platform engineers to build strategies for execution, drive and facilitate key decisions, influence others, and lead change where appropriate.
- A positive, can-do attitude, with a naturally high degree of empathy for others.
- Bachelor's degree in Computer Science or equivalent work experience.
- Relevant work experience, or 3+ years for a senior position.
- Experience in building complex applications from scratch and decomposing monolithic applications into micro-services.
- Core Java 8 (minimum), Spring Boot, Spring Cloud.
- Kubernetes (or Docker/ Mesos and equivalent).
- MySQL, PostgreSQL, EnterpriseDB, NoSQL (Cassandra, MongoDB).
- RabbitMQ, Kafka.
- AWS & GCP.
- API Gateway.
- Linux.
- CI/CD (Jenkins, Git).
- React.js (optional).
- Experience with distributed architectures, SOA, microservices, and Platform-as-a-service (PaaS).
- Experience with Agile and Test-Driven Development (TDD) methodologies.
- Experience with high availability, high-scale, and performance systems.
- Experience in Automation testing/ or Unit testing is a plus.
- Location: True Digital Park, Bangkok.
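The cloud-native patterns listed above (service discovery, circuit breakers, client-side load balancing) each reduce to small, well-known state machines. As one example, here is a minimal, framework-free sketch of a circuit breaker in Python; in the Spring Cloud stack this role describes, the pattern would normally come from a library such as Resilience4j rather than hand-rolled code, and all names here are illustrative.

```python
import time

class CircuitBreaker:
    """Tiny circuit breaker: closed -> open after N consecutive failures,
    half-open after a timeout, and closed again on a success."""

    def __init__(self, failure_threshold: int = 3, reset_timeout: float = 30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: let one trial call through
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # a success closes the circuit again
        return result
```

The payoff is that once a downstream service is failing, callers fail fast instead of queueing up on a dead dependency, which is what gives elastic infrastructure its resilience.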
Skills:
Automation, Product Owner, Python
Job type:
Full-time
Salary:
negotiable
- The candidate will be responsible for designing and implementing new solutions for complex data ingestion from multiple sources into enterprise data products, with a focus on automation, performance, resilience and scalability.
- Partner with Lead Architect, Data Product Manager (Product Owner) and Lead Data Integration Engineer to create strategic solutions introducing new technologies.
- Work with stakeholders including Management, Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Strong development and programming experience in Informatica (IICS), Python, ADF, Azure Synapse, Snowflake, Cosmos and Databricks.
- Solid understanding of databases, real-time integration patterns and ETL/ELT best practices.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- Responsible for ensuring enterprise data policies, best practices, standards and processes are followed.
- Write up and maintain technical specifications, design documents and process flow.
- Mentor a team of onshore and offshore development resources to analyze, design, construct and test software development projects focused on analytics and data integration.
- Elaborate user stories for technical team and ensure that the team understands the deliverables.
- Effectively communicate, coordinate & collaborate with business, IT architecture and data teams across multi-functional areas to complete deliverables.
- Provide direction to the Agile development team and stakeholders throughout the project.
- Assist in Data Architecture design, tool selection and data flows analysis.
- Work with large amounts of data, interpret data, analyze results, perform gap analysis and provide ongoing reports.
- Handle ad-hoc analysis & report generation requests from the business.
- Respond to data related inquiries to support business and technical teams.
- 6+ years of proven working experience in ETL methodologies, data integration and data migration; hands-on development skills in Informatica IICS, Databricks/Spark and Python are a must.
- Clear hands-on experience with database systems: SQL Server, Oracle, Azure Synapse, Snowflake and Cosmos; cloud technologies (e.g. AWS, Azure, Google Cloud); and NoSQL databases (e.g. Cosmos, MongoDB, DynamoDB).
- Extensive experience developing complex solutions focused on data ecosystem solutions.
- Extensive knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- In depth knowledge of data engineering and architecture disciplines.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Solid understanding of P&C Insurance data.
- Technical expertise regarding data architecture, models and database design development.
- Strong knowledge of and experience with Java, SQL, XML, Python, ETL frameworks and Databricks.
- Working knowledge/familiarity with Git version control.
- Strong knowledge of analyzing datasets using Excel.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Proficient in learning new technologies with the ability to quickly understand capabilities and work with others to guide these into development.
- Good communication and presentation skills.
- Solid problem solving, decision making and analytical skills.
- Knowledge of and working experience with Duck Creek is an added plus.
- Knowledge & working experience with Insurity Policy Decisions and/or IEV is an added plus.
- Experience with JIRA.
- Experience being part of high-performance agile teams in a fast-paced environment.
- Must understand the system scope and project objectives to achieve project needs through matrix management and collaboration with other enterprise teams.
- Proven ability to produce results in the analysis, design, testing and deployment of applications.
- Strong team emphasis and relationship building skills; partners well with business and other IT/Data areas.
- Strong coaching / mentoring skills.
- Applies technical knowledge to determine solutions and solve complex problems.
- Ability to be proactive, self-motivated, detail-oriented, creative, inquisitive and persistent.
- Excellent communication and negotiation skills.
- Ability to organize, plan and implement work assignments, juggle competing demands and work under pressure of frequent and tight deadlines.
Experience:
6 years required
Skills:
Industry trends, Java, PHP
Job type:
Full-time
Salary:
negotiable
- Customer First Mindset - Engage with and enable our customers and key decision-makers, delivering a connected customer engagement experience and driving customer satisfaction, through digital sales excellence, empowered by world-class data, marketing systems and platforms.
- Be the key trusted advisor and influencer in shaping customer decisions to buy and adopt Microsoft Azure solutions by winning the customer's technical decision for consumption projects and usage scenarios through tailored messaging, technical discussion ...
- Collaborates with Digital Specialists, extended sales team, partners to conduct business analysis (e.g., whitespace analysis, identify industry trends) to pursue high-potential customers and develop a target list of potential business. Elevate team capabilities and focus on working smarter and more effectively. Prioritizing time with customers and partners, leveraging tools and processes to run and grow the business and build a stronger team.
- Lead technical demonstrations of Azure solutions to explain and prove the capabilities of Microsoft Azure relative to the customer's business and technical objectives. Collaborates with account teams, partners, or services to track, qualify, and expand new opportunities. Collaborates with other teams (e.g., account teams) and services to build pipeline. Interfaces with customers and builds relationships via social selling. Applies Microsoft's sales process to determine the quality of the opportunity and whether to proceed.
- Engages in conversations with customers to introduce how other workloads could enable digital transformation areas that are aligned with the customer's industry, and turns opportunities into deals. Has a deep understanding of customers' business and its priorities to drive conversations with customers on digital transformation across multiple solution areas, in collaboration with partners and services. Creates guiding examples of digital transformation through seminars, workshops, webinars, and direct engagement.
- Build relationships with leadership and field stakeholders to enable team success across internal and external stakeholders. Collaborates with account teams (e.g., Account Executives) to identify and engage senior business subject matter decision makers at the customer's/partner's business and maximize scale through partners; work with technical specialist/CSA to secure commitment.
- Applies the orchestration model to proactively drive deal closure by identifying and aligning internal stakeholders and leveraging and expanding relationships with partners, creating demand leading with industry use cases.
- Required/Minimum Qualifications: 6+ years of technology-related sales or account management experience.
- OR Bachelor's Degree in Information Technology, Business Administration, or related field AND 5+ years of technology-related sales or account management experience.
- Additional or Preferred Qualifications: 8+ years of technology-related sales or account management experience.
- OR Bachelor's Degree in Information Technology, or related field AND 6+ years of technology-related sales or account management experience.
- OR Master's Degree in Business Administration (e.g., MBA), Information Technology, or related field AND 5+ years of technology-related sales or account management experience.
- 3+ years of solution sales or consulting services sales experience.
- Subject matter expertise in any of the following is preferred: systems operations/management, virtualization, IP networking, storage, IT security.
- IT Infrastructure knowledge.
- Software design or development - languages such as .NET, C++, Java, PHP, Perl, Python, Ruby on Rails or Pig/Hive; migrating virtual machines from private to public cloud environments.
- SQL, including OSS (e.g., PostgreSQL, MySQL) and Azure SQL.
- NoSQL databases, including OSS (e.g., MariaDB, MongoDB) and Cosmos DB.
- Data Governance.
- Competitive Landscape - Knowledge of cloud development platforms.
- Partners - Understanding of partner ecosystems and the ability to leverage partner solutions to solve customer needs.
- Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Background in programming, databases and/or big data technologies, OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium-to-complex data sets that meet functional/non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics frameworks supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising on any necessary infrastructure changes based on functional and non-functional requirements.
- In-depth knowledge of the data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems: the Hadoop ecosystem, cloud technologies (e.g., AWS, Azure, Google Cloud), in-memory database systems (e.g., HANA, Hazelcast), traditional RDBMS (e.g., Teradata, SQL Server, Oracle), and NoSQL databases (e.g., Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools: traditional ETL tools (e.g., Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
Experience:
5 years required
Skills:
Agile Development, Continuous Integration, Linux, English
Job type:
Full-time
Salary:
negotiable
- You will implement and maintain back-end services and databases to ensure stability, security, and scalability.
- You will design, build, and maintain CI/CD infrastructure.
- You will support project work by updating and releasing to QA/Production with software releases, configuration updates, and other release requirements.
- You will work with Agile development methodology and continuous integration.
- You will learn and share knowledge with the team.
- You will continuously improve the daily work process.
- A Bachelor's Degree in Computer Science or Information Technology, or equivalent experience.
- 5 years of working experience in development and operations, or a related IT, computer, or operations field.
- Strong background in Linux/Unix Administration.
- Experience with container and container management (Docker, Kubernetes, etc.).
- Experience with automation/configuration management (Ansible, Terraform, etc.).
- Experienced in stress/load testing and analyzing performance.
- Knowledge of the C# or Java programming languages.
- Strong experience with SQL and/or NoSQL databases (e.g., MySQL, MongoDB, Elasticsearch, Redis, Cassandra).
- Self-motivated and structured in your way of working.
- Knowledge of best practices and IT operations in a 24/7 service.
- Fluent in written and spoken English.
- This role is open for both Thai and non-Thai candidates. We can provide full VISA sponsorship if required.
Skills:
DevOps, Automation, Kubernetes
Job type:
Full-time
Salary:
negotiable
- Managing 7-8 Professional Service Engineers responsible for AWS cloud solution architecting and implementation/migration according to the project requirements.
- Team resources management.
- Acting as the key Cloud technical lead for the consulting team, providing AWS cloud technical consulting to customers.
- Design AWS Cloud solution architecture in response to the client's requirements.
- Define the scope of work and estimate man-days for cloud implementation.
- Manage cloud project delivery to meet the customer's requirements and timeline.
- Support AWS and GCP cloud partner competency building (e.g., AWS certification) and the delivery of professional service processes and documentation.
- Serve as the AWS technical speaker for True IDC webinars and CloudTalk online events.
- Drive team competency expansion to meet the yearly competency roadmap strategy, e.g., DevOps, IaC, automation, Kubernetes, and app modernization on AWS cloud.
- Experience in leading cloud AWS implementation and delivery team.
- Experience designing and implementing comprehensive cloud computing solutions on various cloud technologies for AWS; GCP is a plus.
- Experience with infrastructure as code, whether cloud-native (CloudFormation) or other tools, e.g., Terraform or Ansible.
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and their architecture on the Cloud.
- Knowledge of OS administration for both Windows and UNIX technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Knowledge of Kubernetes, Containers and CI/CD, DevOps.
- Experience with RDBMS designing and implementing over the Cloud.
- Prior experience with application development in various development solutions such as Java, .NET, Python, etc.
- Experience in .NET and/or Spring Framework and RESTful web services.
- UNIX shell scripting.
- AWS Certified Solutions Architect - Associate; Professional level preferred.
Skills:
Cloud Computing, RESTful, JSON
Job type:
Full-time
Salary:
negotiable
- Bachelor's or Master's degree in Computer and Telecommunication Engineering, Computer Science, IT or a related field.
- 8 - 13 years of experience in the Computer or Telecommunication field.
- Good knowledge of cloud computing and edge computing technology.
- Good understanding of infrastructure techniques related to TCP/IP, switches, routers, firewalls, load balancers (LBS), and DNS.
- Good understanding of techniques related to IoT/M2M/MEC network protocols: HTTP, HTTPS, REST, MQTT, CoAP, JSON objects, APIs, SNMP.
- Operating system knowledge: Linux (Red Hat, CentOS), Windows Server.
- Database knowledge: MongoDB, NoSQL databases, SQL, PostgreSQL.
- Good understanding of Docker and Kubernetes operations.
Skills:
ETL, Java, Python
Job type:
Full-time
Salary:
negotiable
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Create data products for analytics and data scientist team members to improve their productivity.
- Lead the evaluation, implementation and deployment of emerging tools and process for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Requirements: Previous experience as a data engineer or in a similar role.
- Technical expertise with data models, data mining, and segmentation techniques.
- Knowledge of programming languages (e.g. Java and Python).
- Hands-on experience with SQL database design using Hadoop or BigQuery and experience with a variety of relational, NoSQL, and cloud database technologies.
- Great numerical and analytical skills.
- Experience with BI tools such as Tableau and Power BI.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Continuous Integration, Software Development, Java
Job type:
Full-time
Salary:
negotiable
- Design and develop new features for our POS systems.
- Maintain and enhance existing POS applications.
- Collaborate with product owners to understand requirements and deliver solutions.
- Integrate POS systems with backend services, payment gateways, and third-party APIs.
- Ensure the POS systems are secure, efficient, and reliable.
- Perform code reviews, write unit tests, and participate in continuous integration.
- Troubleshoot and resolve issues related to POS software.
- Document development processes, code, and system configurations.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of software development experience, focusing on POS systems.
- Proficient in programming languages such as C++, C#, Java.
- Experience with POS hardware and peripherals (printers, scanners, payment terminals).
- Strong knowledge of databases (SQL, NoSQL), data management, and messaging using MQ.
- Experience with RESTful APIs and web services.
- Understanding of software development methodologies (Agile, Scrum).
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
Job type:
Full-time
Salary:
negotiable
- The Backend Engineer is responsible for the systems that work behind the scenes. The role is to integrate the application with all required systems and services and provide a clean, easy-to-use API for the frontend part of the system. Although the work is usually hidden from the user, the Backend Engineer bears full responsibility for the application's functionality, performance and scalability. The Backend Engineer knows how to write clean modern APIs, integrate with legacy systems and work with databases.
- Design and develop the business logic and backend systems of the product.
- Work closely with frontend developers to design and develop functional, performant and complete APIs.
- Decipher existing companies' software systems and be able to hook the application into applicable data sources.
- Write both unit and integration tests, and develop automation tools for daily tasks.
- Develop high-quality, well-documented, and efficient code.
- Challenge ideas and opinions to avoid pitfalls and inefficient solutions.
- You have experience as a backend engineer in common languages and frameworks, e.g., ExpressJS, NestJS or any other JS framework.
- You have deep knowledge of object-oriented programming and engineering principles such as SOLID.
- You have significant experience writing and utilizing autonomous service-oriented RESTful API services and performance-tuning large-scale apps.
- You have experience with database systems, with knowledge of SQL and NoSQL stores, e.g., MySQL, MongoDB, Redis, Postgres.
- You are able to write effective unit, integration, and API tests. It is a plus if you have experience integrating with JavaScript frameworks, such as React.
Skills:
ETL, Python, Java
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Implement and optimize data storage solutions, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security and access control measures.
- Optimize query performance and data retrieval processes.
- Evaluate and integrate new data technologies and tools.
- Mentor junior data engineers and provide technical leadership.
- Collaborate with cross-functional teams to support data-driven decision-making.
- Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with big data technologies such as Hadoop, Spark, and Hive.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, data warehousing, and ETL best practices.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.