Data expert with AWS experience
Who are we?
We are a large telco company where big data is becoming increasingly important. We work with 15+ Agile teams on data gathering, analysis, and building data pipelines. It's an entrepreneurial and international environment; English is the main language.
What are you going to do?
Work closely with the product, data science, and software engineering teams on the development of a large-scale machine learning system (the first version of the system is already live!)
Design, implement, test, deploy, and monitor data storage components (e.g. AWS DynamoDB tables), data ingestion components (e.g. AWS Kinesis data streams), and data transformation pipelines (e.g. implemented with AWS Lambda) that support the online (real-time) operation of a machine learning system
Design, implement, test, deploy, and monitor the system's interfaces with external components (e.g. via Amazon API Gateway)
Support integration of the AWS system with external components (end-to-end integration and testing)
Where will you work?
The Hague area
What are you bringing?
Solid knowledge of the AWS ecosystem – these are the technologies we work with:
AWS Step Functions
AWS CloudFormation / Terraform
Amazon API Gateway
Nice to have: Bash
Nice to have: R
Nice to have: Docker (containerization)
Nice to have: familiarity with creating pipelines for machine learning
Hands-on experience in defining and architecting AWS Big Data services along the entire data life cycle of collection, ingestion, storage, processing, and visualization
Experience with batch processing and online streaming data pipelines
Experience with agile / SCRUM
What do we offer you?
Paid and arranged relocation to the Netherlands for you and your family