Requirements
- 3+ years of hands-on Python development experience
- Proficient in English
- Ability to develop clean, elegant, well-commented, and reusable code under version control (Git)
- Minimum 2 years of experience with distributed data crawling techniques (requests, Scrapy, Selenium with a headless browser such as PhantomJS, etc.)
- Knowledge of at least one processing and queuing library (RabbitMQ, RQ, Redis, Celery) for handling large volumes of data
- Knowledge of cloud deployment strategies on AWS or Azure using server orchestration tools
- Experience with data mining and text mining techniques
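To illustrate the crawl-and-queue pattern the requirements describe, here is a minimal sketch. In production the queue would be RabbitMQ, RQ, or Celery and the fetch would go through requests or Scrapy; for a self-contained example, the queue is Python's stdlib `queue` and the fetch is a stub.

```python
import queue
import threading


def fetch(url):
    # Stub for an HTTP fetch; a real crawler would call requests or Scrapy here.
    return f"<html>content of {url}</html>"


def worker(tasks, results):
    # Consume URLs from the shared queue until a None sentinel arrives.
    while True:
        url = tasks.get()
        if url is None:
            tasks.task_done()
            break
        results.append((url, fetch(url)))
        tasks.task_done()


def crawl(urls, num_workers=4):
    tasks = queue.Queue()
    results = []  # list.append is atomic under CPython's GIL
    threads = [
        threading.Thread(target=worker, args=(tasks, results))
        for _ in range(num_workers)
    ]
    for t in threads:
        t.start()
    for url in urls:
        tasks.put(url)
    for _ in threads:
        tasks.put(None)  # one shutdown sentinel per worker
    for t in threads:
        t.join()
    return dict(results)


pages = crawl(["https://example.com/a", "https://example.com/b"])
```

Swapping the stdlib queue for a broker like RabbitMQ lets the workers run on separate machines, which is what makes the crawl distributed.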
At Dploy, we implement strategic solutions that help technology companies build committed offshore teams in Indonesia. Our solution is flexible, scalable, and sustainable to meet the ever-changing needs of your business.
Responsibilities
Dploy Asia is looking for a Python enthusiast who likes to build best-in-class Python crawlers and data processing pipelines, as well as:
- Design and develop highly scalable data crawlers to extract large volumes of data from websites
- Wrangle raw data into cleaned, normalized, and enriched data sets through transformation, normalization, and mapping
- Work in an Agile/Scrum environment with a remote team
- Develop creative ideas on how to work better and smarter
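A minimal sketch of the wrangling step described above. The field names and the mapping are purely illustrative assumptions, not part of any real pipeline:

```python
def clean_record(raw, field_map):
    """Normalize one raw crawled record: trim and lowercase keys,
    rename fields via field_map, collapse whitespace in string values,
    and drop empty values."""
    cleaned = {}
    for key, value in raw.items():
        key = key.strip().lower()
        key = field_map.get(key, key)  # map source field names to canonical ones
        if isinstance(value, str):
            value = " ".join(value.split())  # collapse runs of whitespace
        if value not in (None, ""):
            cleaned[key] = value
    return cleaned


# Hypothetical raw record as it might come off a crawler
raw = {" Product Name ": "  Blue   Widget ", "PRICE": "9.99", "desc": ""}
field_map = {"product name": "name", "price": "price_usd"}
record = clean_record(raw, field_map)
# record == {"name": "Blue Widget", "price_usd": "9.99"}
```

In a real pipeline this per-record function would sit behind the queue consumers, so cleaning scales with the same workers that do the crawling.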
Salary and Benefits
- Spacious and comfortable office in the heart of town
- 24/7 unlimited high-speed internet access
- Free lunch provided
- Free-flowing snacks
- Company outings
- Health insurance
- Monthly allowance