Java Architect + Bigdata needs 4+ years of experience developing enterprise-grade data integration solutions
Java Architect + Bigdata requires:
• Bachelor's Degree with specialized coursework in Computer Science or Management Information Systems
• 4+ years of experience developing enterprise-grade data integration solutions
• Good knowledge of Java and the JVM, Python/JavaScript, C, and Linux systems (5+ years' experience required); capable of programming in both compiled and dynamic languages
• Hands-on experience with Big Data and Hadoop platforms
• Good understanding of Kafka
• Expertise in data stores (both transactional and non-transactional) and the ability to code in highly concurrent environments
• Prior ETL development experience required; 2+ years preferred
• Ability to operate effectively in ambiguous situations.
• Ability to learn quickly, work independently, and contribute as a team player
• Familiarity with Agile software development methodologies to ensure the early and continuous delivery of software
• Experience with Cassandra
• Data modelling and API development
• Experience with Hadoop/Spark/Kafka
Java Architect + Bigdata duties:
• Define, develop, test, analyze, and maintain new and existing data processing, storage, and distribution capabilities in support of business requirements.
• Stage, code, and test datasets, and perform data analysis.
• Research, design, document, and modify data specifications throughout the production life cycle.