Roles & Responsibilities
- Understand customer requirements and render those as architectural models that will operate at large scale and high performance. Where customers have architectures prepared, validate them against non-functional requirements and finalize the build model.
- Work alongside customers to build data management platforms using Elastic MapReduce (EMR), Redshift, Kinesis, Amazon Machine Learning, Amazon Athena, Lake Formation, S3, AWS Glue, DynamoDB, ElastiCache, and the Relational Database Service (RDS).
- Strong experience with related AWS services (EC2, EMR, Redshift, S3) and with streaming and storage technologies such as Kafka, Kinesis, and HDFS.
- Convert Hive SQL to AWS Glue-based SQL (Spark SQL).
- Secure EMR clusters using AWS KMS keys, including customer managed keys, for encryption at rest.
- Deliver working, high-performance data management solutions as CloudFormation templates and reusable artifacts for implementation by the customer. Experience bootstrapping instances with user data scripts is an added advantage.
- Prepare architecture and design briefs that outline the key features and decision points of the application built in the Data Lab
- Work with customers to advise on changes as they put these systems live on AWS
- Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the worldwide AWS solution architect community
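On the EMR security point above, at-rest encryption with KMS is typically expressed as an EMR security configuration. A minimal sketch, assuming a placeholder KMS key ARN (substitute a real customer managed key); the dict follows the JSON shape that `aws emr create-security-configuration` accepts:

```python
import json

# Placeholder KMS key ARN -- an assumption for illustration only.
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/example-key-id"

# EMR security configuration enabling at-rest encryption: SSE-KMS for
# EMRFS data in S3, and KMS-based local disk encryption on cluster nodes.
security_configuration = {
    "EncryptionConfiguration": {
        "EnableInTransitEncryption": False,
        "EnableAtRestEncryption": True,
        "AtRestEncryptionConfiguration": {
            "S3EncryptionConfiguration": {
                "EncryptionMode": "SSE-KMS",
                "AwsKmsKey": KMS_KEY_ARN,
            },
            "LocalDiskEncryptionConfiguration": {
                "EncryptionKeyProviderType": "AwsKms",
                "AwsKmsKey": KMS_KEY_ARN,
            },
        },
    }
}

# Serialized form, as passed when registering the configuration.
security_configuration_json = json.dumps(security_configuration)
```

A configuration like this is registered once (for example via boto3's `emr.create_security_configuration`) and then referenced by name each time a cluster is launched.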
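On the Hive-to-Glue conversion point above, a recurring task is re-expressing a Hive table definition as a Glue Data Catalog entry so the same SQL can run from Glue jobs or Athena. A hypothetical sketch with made-up table, column, and bucket names; the dict follows the `TableInput` shape accepted by boto3's `glue.create_table`:

```python
# Hive DDL being migrated (for reference):
#   CREATE EXTERNAL TABLE page_views (user_id STRING, url STRING)
#   PARTITIONED BY (dt STRING)
#   STORED AS PARQUET
#   LOCATION 's3://example-bucket/page_views/';

# Equivalent Glue Data Catalog definition, in the TableInput shape used by
# glue.create_table(DatabaseName=..., TableInput=table_input).
table_input = {
    "Name": "page_views",
    "TableType": "EXTERNAL_TABLE",
    "PartitionKeys": [{"Name": "dt", "Type": "string"}],
    "StorageDescriptor": {
        "Columns": [
            {"Name": "user_id", "Type": "string"},
            {"Name": "url", "Type": "string"},
        ],
        "Location": "s3://example-bucket/page_views/",
        # Parquet input/output formats and SerDe, matching STORED AS PARQUET.
        "InputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat",
        "OutputFormat": "org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat",
        "SerdeInfo": {
            "SerializationLibrary": "org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe"
        },
    },
}
```

Once registered, the table can be queried with largely unchanged Hive-style SQL from Glue Spark jobs or Athena, which is what makes this mapping the core of most Hive-to-Glue migrations.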
Basic Qualifications
- Highly technical and analytical, with 3 or more years of experience in database and/or analytics systems development and deployment, IT systems engineering, and security and compliance.
- Significant software development and/or IT implementation and consulting experience.
- Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams.
- Ability to understand complex business requirements and render them as prototype systems with quick turnaround.
- Implementation and tuning experience across the big data ecosystem (e.g., EMR, Hadoop, Spark, R, Presto, Hive), databases (e.g., Oracle, MySQL, PostgreSQL, MS SQL Server), NoSQL stores (e.g., DynamoDB, HBase, MongoDB, Cassandra, including design principles), and data warehousing (e.g., Redshift, Teradata, Vertica, including schema design, query tuning, and optimization), as well as data migration and integration.
- Track record of implementing AWS services in a variety of business environments such as large enterprises and start-ups.
- Knowledge of foundation infrastructure requirements such as Networking, Storage, and Hardware Optimization.
- BS-level technical degree required; Computer Science or Mathematics background preferred.
- AWS Certification, e.g., AWS Solutions Architect, AWS Developer, or AWS Certified Data Analytics – Specialty (formerly AWS Certified Big Data – Specialty).
Preferred Qualifications
- Hands-on experience leading large-scale global database, data warehousing, and analytics projects.
- Demonstrated industry leadership in the fields of Big Data, Database Migration, Data Warehousing, Data Science, and Advanced Analytics.
- Deep understanding of data, application, server, and network security
- Experience with Statistics, Machine Learning and Predictive Modelling.
- Hands-on experience as a database, data warehouse, or big data/analytics developer or administrator, or as a data scientist.
- Experience working within the software development or Internet industries is highly desired.
- Technical degrees in computer science, software engineering, or mathematics
- Working knowledge of modern software development practices and technologies such as agile methodologies and DevOps.
Other Skills: