AWS Data Engineer - Long Term Project - Remote Contract Opportunity
Genuent's premier client is hiring an AWS Data Engineer for a long-term, fully remote project. The hiring manager is based in New York, NY. Qualified candidates, please email your professional resume to tmuskat@genuent.com.
Job Requirements:
6+ years of experience in data pipeline engineering for both batch and streaming applications
Must be a hands-on coder in at least one core language (Python, Java, or Scala) with Spark
At least 2 years of expertise working with distributed data warehouses and cloud services (e.g., Snowflake, Redshift, AWS) via scripted pipelines
This role intersects with the 'Big Data' stack to enable varied analytics, ML, etc., not just 'DW'-type workloads
Business domains such as Sales & Marketing, Direct to Consumer, and Ad Sales are of particular interest
Experience handling large, complex sets of XML, JSON, and CSV data from various sources and databases
Solid grasp of database engineering and design
Experience leveraging frameworks and orchestration tools such as Airflow for ETL pipelines
Ability to identify bottlenecks and bugs in the system and develop scalable solutions
Unit test and document deliverables
Capacity to successfully manage a pipeline of duties with minimal supervision
Required Technical Skills:
Very high proficiency in Python, SQL, and DW concepts & logic
A core language skill (Python, Java, or Scala) with big data frameworks such as Spark is a must
Experience with a variety of AWS services: EC2, EMR, RDS, Redshift, and Snowflake
Strong SQL
Other SQL-based databases, such as Oracle, SQL Server, etc.
AWS Cloud knowledge
Nice to have: working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores