
This job has expired

AWS Data Engineer - Remote

Employer
ClearScale
Location
Washington, DC
Closing date
Sep 27, 2021


ClearScale is a leading cloud systems integration company and AWS Premier Consulting Partner providing a wide range of cloud services, including cloud consulting, architecture design, migration, automation, application development, and managed services. We help Fortune 500 enterprises, mid-sized businesses, and startups in verticals such as Healthcare, Education, Financial Services, Security, Media, and Technology succeed with ambitious, challenging, and unique cloud projects. We architect, develop, and launch innovative and sophisticated solutions using the best cutting-edge cloud technologies.

ClearScale is growing quickly and there is high demand for the services we provide. Clients come to us for our deep experience with Big Data, Containerization, Serverless Infrastructure, Microservices, IoT, Machine Learning, DevOps, and more.

ClearScale is looking for an experienced Data Engineer to participate in a custom data pipeline development project.

Responsibilities:

- Migrate data located in a multitude of data stores into the Data Lake
- Orchestrate processes to ETL that data and slice it into the various data marts
- Manage access to the data through Lake Formation
- Build a data delivery pipeline to ingest high volumes of real-time streams, detect anomalies, slice them into window analytics, and put the results into Elasticsearch for dashboard consumption
- Analyze, scope, and estimate tasks; identify the technology stack and tools
- Design and implement the optimal architecture and migration plan
- Develop new and re-architect existing solution modules; re-design and refactor program code
- Specify the infrastructure and assist DevOps engineers with provisioning
- Examine performance and advise on necessary infrastructure changes
- Communicate with the client on project-related issues
- Collaborate with in-house and external development and analytical teams

Basic Qualifications:

- Hands-on experience designing efficient architectures for high-load, enterprise-scale applications or big data pipelines
- Hands-on experience with AWS data toolsets, including but not limited to DMS, Glue, DataBrew, EMR, and SCT
- Practical experience implementing big data architectures and pipelines
- Hands-on experience with message queuing, stream processing, and highly scalable big data stores
- Advanced knowledge of and experience working with SQL and NoSQL databases
- Proven experience re-designing and re-architecting large, complex business applications
- Strong self-management and self-organizational skills

Successful candidates should have experience with any of the following software/tools (not all required at the same time):

- Python and PySpark - strong knowledge, especially with developing Glue jobs
- Big data tools: Kafka, Spark, Hadoop (HDFS, YARN, Tez, Hive, HBase)
- Stream-processing systems: Kinesis Data Streams, Spark Streaming, Kafka Streams, Kinesis Data Analytics
- AWS cloud services: EMR, RDS, MSK, Redshift, DocumentDB, Lambda
- Message queue systems: ActiveMQ, RabbitMQ, AWS SQS
- Federated identity services (SSO): Okta, AWS Cognito

Preferred Qualifications:

- 5+ years of experience in a Data, Cloud, or Software Engineer role
- Degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Use of Hudi with AWS Data Lakes
- 3+ years of graph database development and optimization: Neo4j, SPARQL, Gremlin, TinkerPop, Pregel, Cypher, Amazon Neptune, Knowledge Graphs
- Valid AWS certifications would be a great plus

What's in it for you?

- Competitive salary!
- Excellent medical benefits
- Generous vacation benefit - uncapped paid time off
- Opportunity to build a leadership career in the fast-growing Cloud industry with an industry leader
- Collaborative, high-energy culture
- 100% remote opportunity - distributed workforce - everyone works from home!
- Learning opportunities
