Careers

Design your career with us

We are continuously seeking strong candidates from a diverse range of backgrounds to join our team. We recruit both directly from university and well-seasoned professionals with extensive experience leading or contributing to advanced analytics, data science, or data engineering teams.

Do not hesitate to get in touch with us if you would like to join a multidisciplinary team that applies the most advanced data techniques in a challenging but rewarding environment.

Please send us an e-mail at careers@bedrockdbd.com if none of the roles below fits your profile but you are still interested in joining our team. Do not forget to attach your CV and a brief cover letter so that we can get to know you better.

All positions currently open:

  • Junior Data Scientist
  • DevOps Engineer / Data Architect

Junior Data Scientist

What the team needs:

A data enthusiast with a passion for learning and innovating who would love to participate in international projects related to Data Science and Artificial Intelligence.

You would join our team as a Junior Data Scientist, supporting our team throughout the complete data science and engineering workflow: cleaning and processing data sets, performing feature engineering tasks, developing machine learning and mathematical models, creating data visualisations and validating results. As part of the data operations team, you would be building advanced analytical systems, enabling organisations to view, consume and understand their data, automate processes, augment human decision making, and generate business insights.

It would be a good start if you demonstrate:

  • Proactivity and autonomous problem-solving skills.
  • Excellent command of both written and oral English.
  • Strong mathematical fundamentals.
  • Good programming skills in Python, R or MATLAB.
  • General Artificial Intelligence and Machine Learning knowledge.

It would be awesome if you:

  • Have previously worked with Cloud computing solutions.
  • Are interested in marketing and media data.
  • Have some experience with IoT solutions.
  • Have a Git account and have developed your own portfolio of ML projects.
  • Have used data representation/visualisation tools such as Tableau or Power BI in the past.

What we offer you:

  • A real career progression.
  • Being part of a mission with an honest “why”: democratising Data Science and Artificial Intelligence to empower people at organisations around the world.
  • Feeling a foundational part of our team. All of our colleagues are encouraged to make autonomous decisions and your proactivity can really have an impact on the business.
  • Working in a team where you expand your knowledge and gain valuable technical experience every single day.
  • A working model based on trust, self-responsibility and self-management.
  • A remote-friendly environment: work from home whenever you need or want to.
  • An attractive and cosy working environment in our innovation hub, a rural and modern office at La Pipa (www.lapipa.io), Gijón (Somió), Spain.
  • At Bedrock, we aim to be a reference in talent, humility and collaboration.

DevOps Engineer / Data Architect

What the team needs:

A seasoned data and cloud computing enthusiast with experience designing, deploying, and managing data infrastructures. We would expect the candidate to have some experience as a DevOps Engineer, a Data Architect, or in a similar role.

You will review the specifications of new data applications and assist our data engineers in designing, creating, deploying, and managing our clients’ data architecture. Some tasks will be shared with our Data Engineers, who already take care of processes related to the generation, storage, maintenance, preparation, enrichment and distribution of data. You will also support your data science colleagues by setting up environments for them to perform exploration, analysis and visualisation tasks. All in all, you will be expected to facilitate better coordination among data operations teams under a DevOps philosophy, supporting project development and testing functions by automating and streamlining both the integration and deployment processes.

It would be a good start if you demonstrate:

  • Proactivity and autonomous problem-solving skills.
  • Fluency in written and spoken English.
  • Comfortable programming skills in Python, and ideally some Bash.
  • Relevant experience with:

    · Cloud Computing: AWS, Azure or GCP.

    · ETL orchestration architectures.

    · Container-based solutions, e.g. Docker or Docker Swarm.

    · Monitoring, logging, scheduling and triggers.

    · Provisioning VMs.

    · SQL databases.

  • Experience in Cloud environment administration on AWS, Azure and GCP: Packer + Terraform + Ansible.
  • Experience working under Continuous Integration (CI) and Continuous Delivery (CD), e.g. Jenkins, Azure DevOps, Cloud Build, or similar.

It would be awesome if you:

  • Know how to set up dev environments based on Jupyter Notebooks, Visual Studio Code, RStudio or Juno.
  • Are familiar with Databricks, Apache Airflow, Apache Hadoop, Apache Kafka and Apache Spark for data extraction and transformation tasks.
  • Have worked in integrating data flows into visualisation tools.

What we offer you:

  • A real career progression.
  • Being part of a mission with an honest “why”: democratising Data Science and Artificial Intelligence to empower people at organisations around the world.
  • Feeling a foundational part of our team. All of our colleagues are encouraged to make autonomous decisions and your proactivity can really have an impact on the business.
  • Working in a team where you expand your knowledge and gain valuable technical experience every single day.
  • A working model based on trust, self-responsibility and self-management.
  • A remote-friendly environment: work from home whenever you need or want to.
  • An attractive and cosy working environment in our innovation hub, a rural and modern office at La Pipa (www.lapipa.io), Gijón (Somió), Spain.
  • At Bedrock, we aim to be a reference in talent, humility and collaboration.