EXPERIENCE
Essential skills and experience
- Expert-level data modelling, from conceptual through to physical, covering relational, object, analytical and NoSQL
- At least 10 years data modelling experience in a software engineering environment
- Expert in domain-driven design and microservices
- Hands-on experience with MongoDB, Hadoop, Tableau, Qlik and Spark
- Scripting/Coding experience (e.g. Bash, Python, Perl)
- Excellent experience with the Hadoop ecosystem (such as HDFS, YARN and/or Hive)
- Strong experience with streaming and stream processing frameworks (such as Spark, Storm, Flink, Kafka and/or Kinesis)
- Good knowledge of at least one of the following programming languages: Python, Scala, Go, Kotlin, Java
- Experience with NoSQL databases (such as HBase, Cassandra and/or MongoDB)
- Experience with public cloud-based technologies (such as Kubernetes, AWS, GCP, Azure, and/or OpenStack)
- Expert-level skills in the creation and maintenance of enterprise data artefacts
- Highly proficient in at least one query language for each of the following: relational; analytical; NoSQL
- Proven innovator with proficient software engineering skills to prototype relational, analytical and NoSQL solutions
- Highly motivated with experience of selecting and working with data and data tools
- Experience of canonical modelling
Nice-to-have skills and experience
- Capable of engaging senior stakeholders
- Capable of developing and maintaining strong working relationships within the organisation and third party suppliers
- Able to manage their time effectively and work proactively across projects as well as BAU tasks
- Strong written and verbal skills to communicate complex data and information problems and solutions
- Experience of developing data strategies, policies and standards
- Experience of working within DWP or a comparable organisation within the last 3 years
- TOGAF Practitioner certification