Full-Time Big Data Java Developer (m/f)
Job Description
Talend is a 500-employee Big Data integration software company with deep open source roots. With over $100 million raised to date and continued rapid growth, we are one of the largest independent open source companies in the world.
As part of the Talend R&D team, the Software Developer – Java/Big Data will be responsible for designing, implementing, integrating, and optimizing our core platform for big data processing, which is used across the Talend Platform and our cloud products. Development in this position focuses mainly on server-side Java, leveraging technologies from the Hadoop ecosystem such as HDFS, Spark, and Kafka. This gives the applicant the opportunity to work with up-to-date Big Data technologies in a modern cloud-based deployment environment.
Responsibilities
- Design, implement, test, and deploy a data processing infrastructure using Mesos/YARN that is fault tolerant and scalable to support multiple Talend products.
- Use different protocols as needed for different data services (NoSQL/JSON/REST/JMS/JMX) and provide related tests (test-driven development/unit testing and automation).
- Document structure, process and design of all implemented solutions.
- Build high-availability (HA) architectures and deployments, primarily using big data technologies.
- Work and communicate in a cross-functional, geographically dispersed team environment comprised of software engineers, product managers, software test engineers, and product support engineers.
Qualifications
- BS degree in Computer Science or Computer Engineering, Mathematics, or equivalent experience.
- Extensive experience delivering results in rapid release cycles.
- 2+ years of professional Java programming with an emphasis on data-oriented systems (RESTful, message-/event-driven).
- Working knowledge of Big Data analytics frameworks such as MapReduce and Spark, and cluster resource managers such as Mesos/YARN.
- Experience with key technologies from the Apache Hadoop ecosystem and Hadoop distributions (HDFS, Spark, and Kafka would be ideal).
- An understanding of distributed and cloud computing, including deployment-related experience (ideally with Docker and AWS).
- Experience working in Agile/iterative/Scrum development.
- Experience with test-driven development, unit testing, and test automation.
- Ability to communicate effectively and clearly in English, both verbally and in writing.
- Self-motivated with strong organizational/prioritization skills and ability to multi-task with close attention to detail.
How to Apply
Please apply through this link:
https://app.jobvite.com/j?cj=o0iA2fwE&s=geekjobs.de