Data Architect (DA)

Closing date for applications: 30/12/2021
OVERVIEW OF ROLE
Specialist role
Data architect
Summary of the work
CAMLite is a workflow management tool used by internal DWP staff to update claimant information; the work is created for users as Tasks. DWP are replacing the system with an AWS Cloud-based service, using ARA components where possible, to replace the COTS product.
Latest start date
01/02/2022
Expected contract length
Initial term will be 6 months with an optional 6 months extension
Location
No specific location, for example they can work remotely
Organisation the work is for
Department for Work and Pensions
Maximum day rate
Please specify required day rate
ABOUT THE WORK
Early market engagement
Who the specialist will work with
You will be comfortable working in a cross-functional team (including UX, analysts, statisticians, engineers, product owners, security risk, etc.).
What the specialist will work on
As a Big Data Architect you will be able to demonstrate expertise in leading enterprise data architecture and design for large, complex businesses. Your credible track record of designing for microservices, external data ingestion, real-time data analytics, domain-driven master data models, event-driven enterprise data sharing, authoritative data sources and an external API exchange will be utilized in a continuously improving, challenging and satisfying environment. You'll thrive creating value from large, diverse data volumes, e.g. by using Tableau, Qlik, C3, Hadoop, Spark, SQL, MongoDB Atlas and Python to create modelling features from underlying transactional data.
WORK SETUP
Address where the work will take place
The Services will be aligned to a DWP Technology Hub (including Manchester and Leeds). The majority of the Buyer's Digital Workforce are currently working from home and the Supplier Services will be delivered remotely; this is anticipated to continue.
Working arrangements
The Digital Workforce are currently working from home and the Supplier Services will be delivered remotely; this is anticipated to continue. The team currently work 5 full days; working hours are flexible provided the supplier covers 8 hours between 08:00 and 18:00.
Security clearance
The individual shall comply with the Baseline Personnel Security Standard (BPSS)/Government Staff Vetting Procedures in respect of all persons who are employed or engaged by the Supplier in the provision of this Call Off Contract, prior to each individual commencing work.
ADDITIONAL INFORMATION
Additional terms and conditions
EVALUATION CRITERIA
How many specialists to evaluate
5
Cultural fit criteria
  • Experience of working within agile teams remotely and collaboratively
  • Experience of GDS Standards
  • Proactive
  • Comfortable working in a cross-functional team
  • Ability to communicate effectively to express freedom of thought and innovation, positively challenge and act as an advocate for change
  • Evidence of working in a transparent and visible way that drives performance, ownership and trust, and eliminates surprises
  • Evidence of a culture of continuous improvement
Assessment methods
Evaluation weighting
  • Technical competence: 40%
  • Cultural fit: 10%
  • Price: 50%
EXPERIENCE
Essential skills and experience
  • Expert level data modelling, conceptual through to physical, relational, object, analytical and NoSQL
  • At least 10 years' data modelling experience in a software engineering environment
  • Expert in domain driven design and Microservices
  • Hands-on experience with MongoDB, Hadoop, Tableau, Qlik and Spark
  • Scripting/Coding experience (e.g. Bash, Python, Perl)
  • Excellent experience with the Hadoop ecosystem (such as HDFS, YARN and/or Hive)
  • Strong experience with streaming and stream processing frameworks (such as Spark, Storm, Flink, Kafka and/or Kinesis)
  • Good knowledge of at least one of the following programming languages: Python, Scala, Go, Kotlin, Java
  • Experience with NoSQL databases (such as HBase, Cassandra and/or MongoDB)
  • Experience with public cloud-based technologies (such as Kubernetes, AWS, GCP, Azure and/or OpenStack)
  • Expert level in the creation and maintenance of enterprise data artefacts
  • Highly proficient in at least one query language for each of the following: relational; analytical; NoSQL
  • Proven innovator with proficient software engineering skills to prototype relational; analytical; NoSQL solutions
  • Highly motivated with experience of selecting and working with data and data tools
  • Experience of canonical modelling
Nice-to-have skills and experience
  • Capable of engaging senior stakeholders
  • Capable of developing and maintaining strong working relationships within the organisation and with third-party suppliers
  • Able to manage their time effectively and work proactively across projects as well as BAU tasks
  • Written and verbal skills to enable them to communicate complex data and information problems and solutions
  • Experience of developing data strategies, policies and standards
  • Have experience of working within DWP or a comparable organisation within the last 3 years
  • TOGAF Practitioner
