• 5004 E Fowler Ave, Suite E, Tampa, FL 33617
  • hr@ganait.com
  • +1 (813) 980 1212
  • +1 (813) 797 3077

Software Developer

Location: Remote / Tampa, FL

Experience: 0 Years

Skills: GCP, Python, Terraform, Docker, Jenkins, GitLab CI/CD, Kubernetes, JavaScript, REST APIs

Description:

  • Develop and deploy cloud-native applications on Google Cloud Platform (GCP).
  • Implement CI/CD pipelines using GitLab CI/CD and Jenkins for automated deployments.
  • Manage infrastructure as code using Terraform and Cloud Deployment Manager.
  • Build Docker images and manage container orchestration with Kubernetes (GKE).
  • Automate service deployments and configurations using shell scripts and Python.
  • Monitor cloud resources and set up logging/alerting using Stackdriver (Cloud Monitoring).
  • Integrate IAM roles, service accounts, and workload identity for secure access.
  • Write scalable backend APIs in Python and Node.js.
  • Use Firestore and Cloud SQL for managing application data.
  • Implement version control using Git, branching strategies, and merge pipelines.
  • Participate in code reviews and agile sprints for feature enhancements.
  • Write unit, integration, and automated tests to improve code quality.
  • Build responsive front-ends with HTML, CSS, Bootstrap, and JavaScript frameworks.
  • Collaborate with product managers and QA teams to release stable features.
  • Participate in performance tuning, cost optimization, and GCP billing analysis.
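To give candidates a flavor of the backend-API and unit-testing duties above, here is a minimal, framework-free sketch. The route paths, payloads, and the tiny dispatcher are illustrative assumptions, not part of any Ganait codebase; a real service would use a framework such as Flask or FastAPI and back the data with Firestore or Cloud SQL.

```python
import json

# Toy in-memory route table standing in for a real web framework's router.
_ROUTES = {}

def route(path):
    """Register a handler for a path, mimicking a framework decorator."""
    def register(handler):
        _ROUTES[path] = handler
        return handler
    return register

@route("/health")
def health():
    return {"status": "ok"}

@route("/users")
def users():
    # Hypothetical static payload; a real endpoint would query a datastore.
    return {"users": ["alice", "bob"]}

def handle(path):
    """Dispatch a request path to its handler; return (status, JSON body)."""
    handler = _ROUTES.get(path)
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(handler())
```

A unit test for this sketch would simply call `handle("/health")` and assert on the `(200, ...)` tuple it returns, which is the kind of check the testing bullet above refers to.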

Apply Now

Data Engineer

Location: Hybrid – Tampa, FL

Experience: 4+ Years

Skills: BigQuery, Dataflow, Python, Airflow, ETL, GCS, Pub/Sub, SQL, Spark

Description:

  • Design, build, and manage data pipelines on Google Cloud using Dataflow and BigQuery.
  • Ingest structured and unstructured data from APIs, databases, and flat files.
  • Orchestrate workflows using Apache Airflow and Cloud Composer.
  • Write scalable data processing scripts in Python and PySpark.
  • Automate ETL/ELT processes and ensure data integrity and validation.
  • Use Pub/Sub for real-time streaming and event-based architecture.
  • Load and transform data in BigQuery using SQL, UDFs, and partitioning.
  • Optimize performance and cost for cloud storage and processing jobs.
  • Work with Data Studio and Looker for data visualization and dashboarding.
  • Implement GCS lifecycle policies for archival and retention of raw data.
  • Perform data quality checks and anomaly detection across multiple datasets.
  • Collaborate with analytics, product, and engineering teams to define data needs.
  • Document data dictionaries, pipeline configurations, and lineage.
  • Maintain secure access with IAM, VPC Service Controls, and DLP policies.
  • Conduct peer code reviews and implement continuous improvements in pipeline efficiency.
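As an illustration of the data-quality and anomaly-detection duties above, here is a minimal pure-Python sketch. The z-score threshold, field names, and record shape are assumptions for the example, not a production pipeline; in practice such checks would run inside a Dataflow job or an Airflow task.

```python
import statistics

def flag_anomalies(values, z_threshold=3.0):
    """Return indices of values more than z_threshold standard deviations
    from the mean -- a toy stand-in for a pipeline's anomaly check."""
    if len(values) < 2:
        return []
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]

def validate_record(record, required=("id", "ts", "amount")):
    """Basic schema check: every required field present and non-null."""
    return all(record.get(field) is not None for field in required)
```

For example, `flag_anomalies([10, 11, 9, 10, 100], z_threshold=1.5)` flags the last value, and `validate_record({"id": 1, "amount": 9.5})` fails because `ts` is missing.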

Apply Now

Elevating Customer Experience.