Tech Stack
JavaScript, Pandas, Python, Spark
About the role
- Maintain Scrapy crawler projects and handle data cleaning and extraction
- Crawl data from web pages or mobile apps according to task requirements
- Deliver data reliably while ensuring quality
- Independently solve problems encountered during development, with strong communication skills
- Curious about new technologies, responsible, eager to share knowledge, and willing to help the team grow technically
Requirements
- 2+ years of work experience; Bachelor's degree or above in Computer Science, Engineering, or a related field
- Proficient in Python
- Familiar with web scraping principles and the HTTP protocol; understanding of common anti-scraping techniques
- Skilled with tools such as requests, Scrapy, BS4, XPath, and regex for making network requests and extracting data
- Familiar with data cleaning; able to process data with Pandas/Spark
- Familiar with version control tools (e.g. Git) and database technologies
- Understanding of JS reverse engineering and common JS anti-scraping measures; familiar with JS debugging, locating, and injection
- Understanding of Android reverse engineering; solid grasp of packet capture, hooking, and related techniques
- Experience with overseas food-delivery platforms is a plus
- Fluent English is a plus
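As a rough sketch of the extraction work this role describes, the snippet below pulls structured records out of an HTML fragment. It uses only the standard library's `html.parser` for illustration; an actual project would fetch pages with requests or Scrapy and parse with BS4/XPath as listed above. The sample HTML and class names (`name`, `price`) are hypothetical.

```python
# Minimal extraction sketch using only the standard library.
# In practice: requests/Scrapy for fetching, BeautifulSoup/XPath for parsing.
from html.parser import HTMLParser

# Hypothetical sample markup standing in for a fetched page.
SAMPLE_HTML = """
<ul class="menu">
  <li class="item"><span class="name">Fried Rice</span><span class="price">28</span></li>
  <li class="item"><span class="name">Dumplings</span><span class="price">18</span></li>
</ul>
"""

class MenuParser(HTMLParser):
    """Collects {"name": ..., "price": ...} dicts from <span class="name"> / <span class="price">."""
    def __init__(self):
        super().__init__()
        self.current = None   # "name" or "price" while inside a matching <span>
        self.row = {}
        self.rows = []

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self.current = cls

    def handle_data(self, data):
        if self.current:
            self.row[self.current] = data.strip()

    def handle_endtag(self, tag):
        if tag == "span":
            self.current = None
        elif tag == "li" and self.row:
            self.rows.append(self.row)
            self.row = {}

parser = MenuParser()
parser.feed(SAMPLE_HTML)
print(parser.rows)
# → [{'name': 'Fried Rice', 'price': '28'}, {'name': 'Dumplings', 'price': '18'}]
```

The cleaned rows would then typically be loaded into a Pandas DataFrame (or a Spark job, for larger volumes) for the downstream cleaning step.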
Benefits
- Great office location right in the city center of Shanghai
- Dashmote Flex: a mix of working from home and the office + 20 days of working remotely (anywhere in the world!)
- Market-competitive holiday policy
- Company Laptop
- Referral Bonus
- Annual learning budget
- Annual health check
- Working for a company that was awarded the best B2B startup in Europe by Google, McKinsey, and Rocket Internet
- Working within an international team that truly values your contribution
- An awesome culture of responsibility and the freedom to turn your ambition into reality, regardless of your role and level
- Exciting work atmosphere with no shortage of fun team events, gatherings, and snacks both in person and online
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, web scraping principles, HTTP protocol, anti-scraping techniques, data cleaning, Pandas, Spark, JS reverse engineering, Android reverse engineering, data extraction
Soft skills
Communication skills, sense of responsibility, knowledge sharing, independent problem solving, teamwork