
Data Science Intern
Arthashastra Intelligence
Employment Type: Internship
Location Type: Remote
Location: India
About the role
- Analyze large datasets to discover trends and patterns.
- Web-scrape data and build data pipelines using Apache Airflow and PostgreSQL.
- Build dashboards in Apache Superset and configure the data pipelines behind them.
- Deploy and host the pipelines on AWS instances.
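For candidates unfamiliar with the workflow above, here is a minimal sketch of the kind of extract-transform-load step such a pipeline runs. It uses only the Python standard library; in the actual role this logic would live inside an Airflow task writing to PostgreSQL, and the table name, column names, and sample data below are illustrative assumptions, not the company's real schema:

```python
import json
import sqlite3  # stand-in for PostgreSQL in this self-contained sketch


def extract(raw_json: str) -> list[dict]:
    # Extract: parse raw scraped records (here, a JSON string).
    return json.loads(raw_json)


def transform(records: list[dict]) -> list[tuple]:
    # Transform: drop malformed rows, trim whitespace, coerce types.
    return [
        (r["name"].strip(), float(r["price"]))
        for r in records
        if "name" in r and "price" in r
    ]


def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    # Load: insert the cleaned rows into a hypothetical "products" table.
    conn.execute("CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL)")
    conn.executemany("INSERT INTO products VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)


raw = '[{"name": " Widget ", "price": "9.99"}, {"name": "Gadget"}]'
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # 1 — the record without a price is filtered out
```

An orchestrator like Airflow would wrap each function as a task and schedule the whole chain, but the extract/transform/load split itself is the same.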
Requirements
- Working knowledge of SQL, Python, and PySpark.
- Familiarity with at least one SQL and one NoSQL database.
- Familiarity with ETL and ELT pipelines.
- Expertise in web scraping with Python using Selenium and Beautiful Soup.
- Knowledge of basic data warehousing concepts.
- Familiarity with Apache Airflow, Apache NiFi, or Kafka is an advantage.
- Comfort with a business intelligence tool such as Tableau or Power BI.
- BSc/B.Tech in computer science, engineering, or another relevant field; a certification in data engineering is preferred.
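As a rough illustration of the scraping skills listed above, the sketch below pulls product names out of an HTML snippet. It uses the standard library's html.parser so it runs with no dependencies; in practice the role would use Beautiful Soup or Selenium, and the tag/class names and sample markup here are made-up assumptions:

```python
from html.parser import HTMLParser


class ProductNameParser(HTMLParser):
    """Collects the text inside <span class="product-name"> tags."""

    def __init__(self):
        super().__init__()
        self.in_name = False
        self.names = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag.
        if tag == "span" and ("class", "product-name") in attrs:
            self.in_name = True

    def handle_endtag(self, tag):
        if tag == "span":
            self.in_name = False

    def handle_data(self, data):
        if self.in_name:
            self.names.append(data.strip())


html = '<div><span class="product-name">Widget</span><span class="price">9.99</span></div>'
parser = ProductNameParser()
parser.feed(html)
print(parser.names)  # ['Widget'] — the price span is ignored
```

Beautiful Soup would reduce this to a one-liner (`soup.select("span.product-name")`), but the underlying idea of walking the tag tree and filtering by class is the same.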
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL, Python, PySpark, web scraping, Selenium, Beautiful Soup, ETL, ELT, data warehousing, Apache Airflow
Certifications
BSc in computer science, B.Tech in engineering, data engineering certification