Middlewares and some string util functions refactored. Added partial Documentation.
README.md | 67
@@ -19,13 +19,13 @@ See scraper [Options](https://godoc.org/github.com/geziyor/geziyor#Options) for
 
 ## Status
 
 Since the project is in **development phase**, **API may change in time**. Thus, we highly recommend using Geziyor with Go modules.
 
-## Usage
+## Examples
 
 Simple usage:
 
 ```go
 geziyor.NewGeziyor(geziyor.Options{
     StartURLs: []string{"http://api.ipify.org"},
-    ParseFunc: func(r *geziyor.Response) {
+    ParseFunc: func(g *geziyor.Geziyor, r *geziyor.Response) {
         fmt.Println(string(r.Body))
     },
 }).Start()
@@ -42,21 +42,76 @@ func main() {
     }).Start()
 }
 
-func quotesParse(r *geziyor.Response) {
+func quotesParse(g *geziyor.Geziyor, r *geziyor.Response) {
     r.DocHTML.Find("div.quote").Each(func(i int, s *goquery.Selection) {
-        r.Geziyor.Exports <- map[string]interface{}{
+        g.Exports <- map[string]interface{}{
             "text":   s.Find("span.text").Text(),
             "author": s.Find("small.author").Text(),
         }
     })
     if href, ok := r.DocHTML.Find("li.next > a").Attr("href"); ok {
-        r.Geziyor.Get(r.JoinURL(href), quotesParse)
+        g.Get(r.JoinURL(href), quotesParse)
     }
 }
 ```
 
 See [tests](https://github.com/geziyor/geziyor/blob/master/geziyor_test.go) for more usage examples.
 
-## Installation
+## Documentation
+
+### Installation
 
     go get github.com/geziyor/geziyor
+
+### Making Requests
+
+Initial requests start with the ```StartURLs []string``` field in ```Options```.
+Geziyor makes concurrent requests to those URLs.
+After a response is read, ```ParseFunc func(g *Geziyor, r *Response)``` is called.
+
+```go
+geziyor.NewGeziyor(geziyor.Options{
+    StartURLs: []string{"http://api.ipify.org"},
+    ParseFunc: func(g *geziyor.Geziyor, r *geziyor.Response) {
+        fmt.Println(string(r.Body))
+    },
+}).Start()
+```
+
+If you want to create the first requests manually, set ```StartRequestsFunc```.
+```StartURLs``` won't be used if you create requests manually.
+You can make subsequent requests using these ```Geziyor``` methods:
+
+- ```Get```: Make a GET request.
+- ```GetRendered```: Make a GET request and render JavaScript using a headless browser.
+  As it opens up a real browser, requests take a couple of seconds.
+- ```Head```: Make a HEAD request.
+- ```Do```: Make a custom request by providing a ```*geziyor.Request```.
+
+```go
+geziyor.NewGeziyor(geziyor.Options{
+    StartRequestsFunc: func(g *geziyor.Geziyor) {
+        g.GetRendered("https://httpbin.org/anything", g.Opt.ParseFunc)
+        g.Head("https://httpbin.org/anything", g.Opt.ParseFunc)
+    },
+    ParseFunc: func(g *geziyor.Geziyor, r *geziyor.Response) {
+        fmt.Println(string(r.Body))
+    },
+}).Start()
+```
 
 ## Roadmap
 
 If you're interested in helping this project, please consider these features:
 
 - Command line tool for: pausing and resuming scraper etc. (like [this](https://docs.scrapy.org/en/latest/topics/commands.html))
 - Automatic item extractors (like [this](https://github.com/andrew-d/goscrape#goscrape))
 - Deploying Scrapers to Cloud
 - ~~Automatically exporting extracted data to multiple places (AWS, FTP, DB, JSON, CSV etc.)~~
 - Downloading media (Images, Videos etc.) (like [this](https://docs.scrapy.org/en/latest/topics/media-pipeline.html))
 - Realtime metrics (Prometheus etc.)