Big Data Architect

Branch: IT Consulting / Advisory (Doradztwo / Konsulting IT)
Region: Mazowieckie
City: Warszawa
Agreement type: B2B contract
Publish date: 2021-04-28

About the employer

ITFS is a dynamically developing company that provides IT specialist outsourcing and delivers comprehensive IT projects. We operate in the Tri-City, Wrocław, and Warsaw, and we are also expanding into European markets, including Switzerland, Great Britain, the Netherlands, and Spain.

Offer

Location: 100% remote;
Start: ASAP (a one-month notice period is accepted);
Contract: B2B, long-term cooperation;
Rate: 110–140 PLN per hour.

Responsibilities

  • Guide the full architectural lifecycle of a Big Data solution, including requirements analysis, governance, capacity planning, technical architecture design (hardware, OS, and network topology), application design, testing, and deployment.
  • Provide technical direction to a team that designs and develops path-breaking, large-scale cluster data-processing systems.
  • Interact with domain experts, data architects, solutions architects, and analytics developers to define data models for ingesting streaming input and delivering analytics output.
  • Help our Client’s internal partners develop strategies that maximize the value of their data.
  • Help establish thought leadership in the Big Data space by contributing internal papers and technical commentary to the user community.

Expectations

The Big Data Architect will create technical designs for complex Big Data solutions and produce the Technical Design Specification and BOM (bill of materials), both in support of sales opportunities and during the architectural phase of the project lifecycle. Candidates must be able to demonstrate in-depth knowledge of design in the following core technologies:
  • The Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, HBase, Flume, Spark, Kafka, Flink, Java) or equivalent; candidates should have delivered at least one or two projects at production scale on such a stack.
  • Nice to have: experience with cloud technologies such as AWS, Azure, Google Cloud, or Oracle Cloud.

If you are interested, do not hesitate to apply :)

Couldn't find an offer that meets your competencies or expectations?

Please send us your CV; we may be able to match the ideal offer for you.