Jr. Software Engineer/Data Engineer
Requirements:
- Must be a US Permanent Resident (no sponsorship available)
- Must already hold a Bachelor's degree; candidates currently pursuing a Bachelor's will not be considered
Top Skills:
- Basic proficiency with any cloud platform, preferably with GCP data and analytics services such as BigQuery, Pub/Sub, and Composer.
- Basic understanding of designing and building production data pipelines in GCP using Java or Python.
- Some hands-on experience with Kafka and Airflow/Composer DAGs (a minimal DAG sketch follows this list).
- Basic knowledge of data model design and optimization in GCP.
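For illustration only, not an additional requirement: a minimal Composer/Airflow DAG sketch in Python for the kind of production pipeline described above. The bucket, dataset, and table names are placeholders, and a recent Airflow 2.x environment with the Google provider package is assumed.

    # Minimal sketch: load daily newline-delimited JSON files from Cloud Storage
    # into BigQuery. All resource names below are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_events_load",      # hypothetical pipeline name
        schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        load_events = GCSToBigQueryOperator(
            task_id="load_events_to_bq",
            bucket="example-raw-events",                            # placeholder bucket
            source_objects=["events/{{ ds }}/*.json"],              # one folder per run date
            destination_project_dataset_table="analytics.events",   # placeholder table
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_APPEND",
        )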
Responsibilities:
- Assist in ensuring technology solutions meet required criteria such as resiliency and stability
- Support ongoing development, maintenance, and support of existing systems and platforms
- Collaborate with team members and provide feedback
- Help identify subject matter experts and contribute to performance reviews
- Help take ownership of and communicate the health and sustainability of technology solutions
- Contribute to innovation and continuous improvement
- Perform other duties as assigned
Technical Skillset:
- Some experience with large data models and data architecture principles.
- Basic proficiency in GCP services such as Cloud Storage, BigQuery, Cloud SQL, Dataflow, and Composer (a short BigQuery client example follows this list).
- Basic programming skills in Python, Java, Node.js, etc.
- Understanding of cloud computing principles (IaaS, PaaS, SaaS).
- Some experience with the Google Cloud SDK and APIs for programmatic interaction.
- Basic knowledge of containerization technologies like Docker and orchestration platforms like Kubernetes.
- Understanding of, and some experience with, serverless computing on GCP using services like Cloud Functions and Cloud Run.
- Experience setting up CI/CD pipelines using tools like GitHub Actions.
- Basic proficiency with development tools and IDEs like Google Cloud Shell, Cloud Code for VS Code, or Cloud Tools for IntelliJ.
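Again purely as illustration of the programmatic GCP access mentioned above: a short example using the google-cloud-bigquery Python client. The project, dataset, and table identifiers are placeholders, not details from this posting.

    # Illustrative only: run a query against BigQuery with the Python client library.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project ID

    query = """
        SELECT event_date, COUNT(*) AS event_count
        FROM `example-project.analytics.events`          -- placeholder table
        WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
        GROUP BY event_date
        ORDER BY event_date
    """

    # client.query() submits the job; result() waits for completion and returns rows.
    for row in client.query(query).result():
        print(f"{row.event_date}: {row.event_count}")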