
Hadoop Data Architect

Employer: Yoh Services, LLC
Location: Bowie, MD
Closing date: Sep 22, 2019


Data Architect - Hadoop

The Data Architect is responsible for providing leadership in architecture and technical design, enabling design for scale using Big Data and relational database technologies; driving pragmatic solutions with a team while ensuring the highest quality software and transparency and predictability of releases; and ensuring adherence to agile principles to facilitate customer value and time to market.

Responsibilities

- Work with staff members to develop pragmatic and detailed data architecture to support the continuing evolution of the company's data architecture
- Support technical architecture and design of systems in a way that ensures compliance with certification, regulatory standards, and a technology-based information security program
- Ensure teams follow best practices regarding coding standards, code reviews, and testing (including unit, integration, and system tests)
- Assess technological options and design offerings supporting scalable, high-performance, and highly available environments
- Participate with Company leadership in the strategic development of technology initiatives to identify product and system enhancements that will improve customer and stakeholder value
- Partner with engineers and architects to ensure developed solutions adhere to established best patterns and our architectural target state
- Provide technical thought leadership toward solving problems for the team
- Drive the adoption of key engineering best practices to improve the quality and reliability of the team's deliverables
- Coordinate and communicate with senior and executive management to ensure goals are met within budget
- Collaborate with other technologists on creating cross-domain solutions

Qualifications

- Bachelor's degree (or higher) in Computer Science, MIS, IT, or a related field
- 10 years of experience developing large-scale, distributed, data-centric solutions, with at least 5 years in Java or C#
- 10 years of experience leading the development and delivery of Big Data, Data Warehousing, and Data Integration solutions
- 3 years of software development experience in a Hadoop environment (Hive, Pig, HBase)
- 3 years of experience using open source message broker software such as RabbitMQ and/or Kafka
- Knowledge of or working experience with data processing algorithms and design patterns
- Experience working with big data technologies and high-volume transactional systems
- Strong understanding of ETL process design and automation, with recent relevant work experience
- Strong understanding of database design and development
- Strong understanding of architecture patterns and operational characteristics of highly available and scalable applications
- Working knowledge of self-service experiences and the open source web application technology stack is a plus
- Experience persisting data in one or more relational and NoSQL database technologies such as MS SQL Server, MongoDB, Cassandra, CouchDB, or Postgres
- Experience with API gateways and authentication technologies such as OAuth2 and SAML
- Experience building service-oriented architectures and proficiency with distributed systems built on the cloud (AWS, Azure, Google Cloud)
- Experience building large-scale web services and microservices-based applications in a cloud environment is a plus
- Passion for security and a strong understanding of threats, vulnerabilities, and compliance standards
- Experience participating in and leading code reviews, refactoring, and gathering code quality metrics and measurements
- Fluency with modern DevOps automation toolchains, the AWS ecosystem, and modern containerization, orchestration, and virtualization technologies
- Experience using source control tools such as Microsoft Team Foundation Server and/or GitHub is a plus, but not required
