Description

Remote Big Data Administrator

Power the World’s Data—Shape Tomorrow from Anywhere

Are you driven by the challenge of scaling data platforms and transforming raw information into actionable insights? As a Remote Big Data Administrator, you will design, build, and continually enhance high-impact data ecosystems that enable organizations to innovate and lead. With an annual salary of $125,383, this position gives you the autonomy to solve complex data puzzles, support mission-critical analytics, and influence how decisions are made, no matter where you call home.

About the Role: Your Impact on the Future of Data

Imagine being the architect behind seamless data flow in organizations serving millions of users. You'll take ownership of setting up and overseeing distributed systems, leveraging cloud-native solutions, and optimizing performance, supporting everything from real-time analytics to large-scale machine learning workflows. Your expertise in data infrastructure will allow teams to access, analyze, and trust their data, powering discoveries that change industries.

Our environment is fast-paced, forward-thinking, and collaborative. Every day, you’ll engage with cross-functional teams, from data engineers and scientists to business intelligence leads and product owners. This isn’t just about administration—it’s about setting new standards in how information is stored, accessed, and leveraged to drive measurable results.

Key Responsibilities: What You’ll Own

  • Architect, deploy, and monitor large-scale distributed data platforms (Hadoop, Spark, Cassandra, or similar frameworks) to support high-throughput data processing.
  • Automate data pipeline management, implement robust data governance policies, and safeguard sensitive information through advanced security protocols.
  • Tune data storage and retrieval for performance, ensuring sub-second query response times on petabyte-scale datasets.
  • Partner closely with technical specialists and data science professionals to optimize machine learning infrastructure and accelerate model training and deployment.
  • Proactively identify system bottlenecks, analyze logs, and apply corrective action before issues escalate.
  • Document evolving workflows, create visual dashboards for usage analytics, and mentor teams on best practices in data stewardship.
  • Oversee migrations to cloud-based platforms, such as AWS, Azure, or Google Cloud, while minimizing downtime and data risk.
  • Stay current on emerging tools, frameworks, and methodologies, contributing to the company’s reputation for technical innovation.
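To give a flavor of the bottleneck-hunting and log-analysis work described above, here is a minimal sketch of a slow-query scan. The log line format, field names (`query_id`, `duration_ms`), and threshold are illustrative assumptions, not any particular system's format:

```python
import re

# Hypothetical log line format (an assumption for illustration only):
#   2024-05-01T12:00:00Z query_id=q42 duration_ms=1840 table=events
LOG_PATTERN = re.compile(r"query_id=(?P<qid>\S+)\s+duration_ms=(?P<ms>\d+)")

def find_slow_queries(log_lines, threshold_ms=1000):
    """Return (query_id, duration_ms) pairs exceeding the latency threshold."""
    slow = []
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match and int(match.group("ms")) > threshold_ms:
            slow.append((match.group("qid"), int(match.group("ms"))))
    return slow

sample = [
    "2024-05-01T12:00:00Z query_id=q41 duration_ms=120 table=events",
    "2024-05-01T12:00:01Z query_id=q42 duration_ms=1840 table=events",
]
print(find_slow_queries(sample))  # -> [('q42', 1840)]
```

In practice a check like this would run on a schedule and feed alerts into monitoring before issues escalate, rather than print to stdout.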

Technologies and Tools: Your Daily Arsenal

  • Distributed Data Systems: Hadoop, Apache Spark, Apache Kafka, Cassandra, HBase
  • Data Integration: Apache NiFi, Talend, Informatica, Airflow
  • Cloud Platforms: Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP)
  • Automation: Ansible, Terraform, Docker, Kubernetes
  • Monitoring & Security: Prometheus, Grafana, Splunk, Apache Ranger
  • Query Engines: Hive, Presto, Impala
  • Programming: Python, Scala, Java, Bash
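Tuning storage at this scale usually starts with simple capacity math before any tool is touched. The sketch below shows the kind of back-of-envelope estimate involved; the default replication factor of 3 follows the common HDFS convention, and the 25% headroom figure is an illustrative assumption:

```python
def raw_capacity_needed(logical_tb, replication_factor=3, headroom=0.25):
    """Estimate raw disk (TB) for a dataset: replication overhead plus free-space headroom."""
    return logical_tb * replication_factor * (1 + headroom)

# 400 TB of logical data, 3x replication, 25% free-space headroom:
print(raw_capacity_needed(400))  # -> 1500.0
```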

You’ll have the opportunity to work with best-in-class platforms, influence tool selection, and test the latest technology in a production environment. Our teams celebrate experimentation and support ongoing learning with paid certifications and hands-on labs.

Why You’ll Love Working Here: Environment and Culture

Picture yourself collaborating in a remote-first, globally distributed team. Here, your voice matters—whether you're leading a system upgrade, brainstorming new data security solutions, or mentoring a junior administrator. We use agile methodologies, frequent stand-ups, and transparent communication tools to maintain connection and drive progress.

  • Flexible work hours support your lifestyle—no matter your time zone.
  • Continuous learning is a core value, with access to exclusive tech conferences and online courses.
  • Transparent recognition: Data-driven metrics and real-time dashboards celebrate both individual and team success.
  • Our culture values curiosity, ownership, and a passion for innovation. We believe that bold ideas and diverse perspectives fuel meaningful breakthroughs.

Qualifications: What Sets You Apart

  • Proven experience in administering and optimizing distributed big data platforms in a remote or hybrid environment.
  • Deep familiarity with data lake architectures, cloud-native data storage, and data replication strategies.
  • Demonstrated proficiency in at least one programming language (Python, Scala, or Java) for automating administrative tasks.
  • Expertise in designing data backup, disaster recovery, and security protocols for high-availability systems.
  • Familiarity with compliance frameworks (GDPR, HIPAA, or similar) to ensure regulatory adherence.
  • Strong analytical, troubleshooting, and documentation skills—backed by examples of driving measurable improvement in system uptime, latency, or scalability.
  • Effective communication and collaboration abilities, demonstrated across multidisciplinary teams and virtual settings.
  • A degree in computer science, information systems, or a related field (or equivalent real-world experience).

Growth and Opportunity: Elevate Your Career

  • Lead major cloud migration initiatives, directly impacting our ability to scale globally.
  • Influence future technology investments by piloting emerging frameworks and tools.
  • Shape internal best practices, set new benchmarks for uptime and performance, and inspire the next generation of remote data professionals.
  • Be a visible contributor in an organization recognized for setting industry standards—our teams have managed over 50 petabytes of active data across five continents.

Ready to Take Charge of Big Data?

Join us as we push the boundaries of what’s possible in remote data administration. Here, your expertise will power transformative business outcomes, accelerate innovation, and shape the digital future from wherever you choose to work.

Apply today and help us unlock the next era of data-driven achievement—one terabyte at a time.