
Data / Big Data Solutions Architect

Applications are now closed.

ScienceSoft is looking for a Data / Big Data Solutions Architect.

The digital platform will host Customer Data Utilization and Customer Business Intelligence.

Internal and external customers are currently using disparate systems to track information relevant to, but not limited to, key-to-key operations, profits, and inventory.

The digital platform will include a data lake, a data warehouse, an analytics and data science platform, a Customer portal, interactive analytics, and KPI dashboards. Information from several siloed systems will be fed via server integration services into the data lake and data warehouse, which will in turn provide accessible data for Customer Data Utilization and Customer Business Intelligence.
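
As a rough illustration of the ingestion flow described above (not a project specification), the sketch below shows how a table from one siloed system might be landed in the data lake with PySpark. All connection details, table names, and paths are hypothetical placeholders.

    # Hypothetical sketch: land one siloed system's table in the data lake as Parquet.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("siloed-system-ingestion").getOrCreate()

    # Read the source table over JDBC (host, database, and credentials are placeholders).
    inventory = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://source-host:5432/operations")
        .option("dbtable", "public.inventory")
        .option("user", "etl_user")
        .option("password", "etl_password")
        .load()
    )

    # Write the raw extract to the lake, partitioned by load date, for the
    # warehouse and BI layers to consume downstream.
    (
        inventory.withColumn("load_date", F.current_date())
        .write.mode("append")
        .partitionBy("load_date")
        .parquet("s3://customer-data-lake/raw/inventory/")
    )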

Visit our Big Data service page to learn about the approach and competencies of our data analytics team and get an idea of potential projects.

Responsibilities

  • Collaborating with Customers to gather, assess, and interpret client needs and requirements
  • Analyzing structural requirements for new software and applications and translating them into specifications for solution development
  • Evaluating pros/cons among the identified options before arriving at a recommended solution optimal for the client’s needs.
  • Advising on data solution architecture and performance, altering the ETL process, providing SQL transformations, API integration, and deriving business and technical KPIs
  • Developing and delivering data solutions
  • Re-engineering business intelligence processes; designing and developing data models, DWH, and data lake designs
  • Sharing your expertise throughout the deployment and RFP process.
  • Installing and configuring information systems to ensure functionality
  • Involvement in RFP and pre-sales activities

Requirements

  • Excellent verbal and written communication skills
  • Experience gathering and analyzing system requirements, analytical skills
  • Experience in data solution architecture, design, and deployment
  • Experience with the Microsoft Azure ecosystem
  • Experience with the AWS ecosystem (EMR, EC2, S3, RDS, Redshift, Aurora, Athena, PostgreSQL)
  • Engineering experience in large data systems on SQL, Hadoop, etc.
  • Expertise in Microsoft SQL, Oracle and/or other transactional databases
  • Experience with data warehousing.
  • Experience with ETL tools, ETL technical designs, and data flows
  • Experience with business intelligence, enterprise reporting, and data visualization tools (Power BI, Tableau, Qlik, etc.)
  • Experience in data science and machine learning, data mining, and segmentation techniques
  • Experience with NoSQL-based, SQL-like technologies (e.g., Hive, Pig, Spark SQL/Shark, Impala, BigQuery)
  • Experience in building distributed Big Data solutions, including ingestion, caching, processing, consumption, logging & monitoring
  • Strong development experience in streaming platforms (Kinesis, Kafka, Spark Streaming, Apache Flink)
  • Development experience in batch processing platforms
  • Proficiency in one or more programming or scripting languages, such as Python, Java, Scala, or C#
  • 7+ years of relevant work experience required.
  • Bachelor's degree in computer science, engineering or related field.

We Offer

  • Opportunity for professional self-realization
  • Friendly and united team
  • days of paid vacation
  • 100%-paid sick leave
  • Sports program
  • Language courses and other corporate programs
  • Medical insurance
  • Competitive (official) salary.

Apply for this position