3.0.0.10

Andy
2022-05-23 15:52:51 +03:00
parent 75c4c1ef53
commit 5a747db867
2 changed files with 38 additions and 5 deletions

Home.md (31 changed lines)

@@ -46,9 +46,9 @@ More about channels [here](https://github.com/AAndyProgram/SCrawler/wiki/Channel
**[Hash 2](https://github.com/AAndyProgram/SCrawler/wiki/Settings#how-to-find-instagram-hash-2) and cookies required to download saved Instagram posts**
-Go to Settings - Settings - Reddit/Instagram. Enter your username in the "Saved posts user" textbox. Click OK. Go to the main window.
+Go to Settings - Settings - Reddit/Instagram/Twitter. Enter your username in the "Saved posts user" textbox. Click OK. Go to the main window.
-This button looks like a bookmark. When you click on this button, a form for downloading saved posts will open. Saved posts are stored in the Reddit/Instagram data path in the ```!Saved posts``` folder.
+This button looks like a bookmark. When you click on this button, a form for downloading saved posts will open. Saved posts are stored in the Reddit/Instagram/Twitter data path in the ```!Saved posts``` folder.
![Saved posts window](https://github.com/AAndyProgram/SCrawler/blob/main/ProgramScreenshots/SavedPosts.png)
@@ -60,9 +60,12 @@ This button looks like a bookmark. When you click on this button, a form for dow
- ```Download all site users``` - Download all users marked ```Ready for download``` from specific sites.
- ```Download all FULL``` - Download all users from all sites. **The ```Ready for download``` option will be ignored.**
- ```Download all site users FULL``` - Download all users from specific sites. **The ```Ready for download``` option will be ignored.**
+- ```Add a new download group``` - create a new [download group](#download-groups)
- ```Download video``` - download a separate video (Reddit and Twitter videos supported) or Instagram post (photo and video).
- ```Stop``` - stop all download operations.
+![Download all menu](https://github.com/AAndyProgram/SCrawler/blob/main/ProgramScreenshots/MainWindowGroups.png)
### View
- ```View``` - users view modes.
@@ -174,6 +177,30 @@ If you want to merge the collection.
Files from the "Reddit_SomeUserName1" and "Twitter_SomeUserName1" folders will be moved to the "First collection" folder.
+# Download groups
+In many cases, you may need to download only some of your users. You can group these users to make it easier to download them without having to select them every time.
+Creating a new group is very easy: just click the ```Add a new download group``` button (main window - ```Download all``` menu).
+- ```Name``` - group name
+- ```Temporary``` - only users marked as temporary will be downloaded (use the indeterminate state to ignore this option)
+- ```Favorite``` - only users marked as favorite will be downloaded (use the indeterminate state to ignore this option)
+- ```Ready for download``` - only users marked as ```Ready for download``` will be downloaded
+- ```Ignore ready for download``` - this option tells the program to ignore the ```Ready for download``` user option and download the user anyway
+- ```Labels``` - you can select labels; only users who have one or more of these labels will be downloaded
+**Only those users who match all of these parameters (logical operator ```AND```) will be downloaded.**
+![Group creator](https://github.com/AAndyProgram/SCrawler/blob/main/ProgramScreenshots/GroupCreating.png)
+For each group, SCrawler creates a new menu, which is placed in the ```Download all``` menu of the main window. If a group has a number (1-9) on the left, that group can be downloaded using ```Ctrl```+```Number```. Each group also has several options:
+- ```Edit``` - edit the group
+- ```Delete``` - delete the group
+- ```Download``` - download with the options you have set
+- ```Download FULL``` - download with the options you have set. **The ```Ready for download``` option will be ignored.**
+![Groups context](https://github.com/AAndyProgram/SCrawler/blob/main/ProgramScreenshots/MainWindowGroups.png)
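The group filter described above combines every set parameter with logical ```AND```, and the indeterminate checkbox state means "ignore this criterion". The logic can be sketched in Python; this is only a hypothetical illustration of the rule, not SCrawler's actual code, and all class and field names are invented:

```python
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class User:
    temporary: bool = False
    favorite: bool = False
    ready_for_download: bool = True
    labels: Set[str] = field(default_factory=set)

@dataclass
class DownloadGroup:
    # None models the indeterminate checkbox state: the criterion is ignored.
    temporary: Optional[bool] = None
    favorite: Optional[bool] = None
    ready_for_download: bool = False   # require the 'Ready for download' mark
    ignore_ready: bool = False         # download even if the user is not ready
    labels: Set[str] = field(default_factory=set)

    def matches(self, user: User) -> bool:
        """All set criteria must pass (logical AND)."""
        if self.temporary is not None and user.temporary != self.temporary:
            return False
        if self.favorite is not None and user.favorite != self.favorite:
            return False
        if (self.ready_for_download and not self.ignore_ready
                and not user.ready_for_download):
            return False
        # Labels: the user must carry at least one of the selected labels.
        if self.labels and not (self.labels & user.labels):
            return False
        return True
```

A group with ```favorite=True``` and labels ```{"art"}``` would then download only favorite users carrying the ```art``` label.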
# Black list
The program has a ```Black list``` that contains the users you have added to it. It is only the program's own blacklist and has no impact on the site!

@@ -20,7 +20,11 @@
- ```Show notifications``` - show notifications when download is complete
- ```Fast profiles loading``` - fast loading of profiles in the main window. **Be careful with this setting. Fast loading leads to the highest CPU usage.**
- ```Delete data to recycle bin``` - delete data to the recycle bin or permanently
+- ```Open the Info form when the download starts```
+- ```Open the Progress form when the download starts```
+- ```Don't open again``` - do not automatically open the corresponding form if it has been closed once
- ```Folder cmd``` - the [command](#folder-command) to open a folder
+- ```Close cmd``` - this [command](#scrawler-script-text-examples) will be executed when SCrawler is closed
![Basis settings](https://github.com/AAndyProgram/SCrawler/blob/main/ProgramScreenshots/SettingsGlobalBehavior.png)
@@ -42,6 +46,7 @@
- ```Time``` - append time to file name
- Date positions ```Start/End``` - date and/or time will be appended to the end or beginning of the file name
- ```Script``` - [script](#how-to-use-the-script) to be executed after the user download is complete. If the checkbox is checked, new users will be created with the ```Use script``` option.
+- ```After download cmd``` - this [command](#scrawler-script-text-examples) will be executed after all downloads are completed
![Default settings](https://github.com/AAndyProgram/SCrawler/blob/main/ProgramScreenshots/SettingsGlobalDownloading.png)
@@ -107,6 +112,7 @@ The Reddit parser can parse data without cookies, but you can add it if you like
- Authorization
- ```Hash``` - in this field you need to put the hash of the Instagram session. Just add cookies and click on the curved arrows.
- ```Hash 2``` - **\[For saved Instagram posts only\]** in this field you need to put the [hash of the Instagram session for saved posts](#how-to-find-instagram-hash-2).
+- ```x-csrftoken``` - **\[For Stories and Tagged data only\]** read [here](#how-to-find-instagram-stories-authorization-headers) how to find them
- ```x-ig-app-id```, ```x-ig-www-claim``` - **\[For Stories and Tagged data only\]** read [here](#how-to-find-instagram-stories-authorization-headers) how to find them
- ```Saved posts user``` - your personal Instagram username to download your saved posts (this feature requires cookies and **InstaHash 2**)
- Other parameters
@@ -138,7 +144,7 @@ The Reddit parser can parse data without cookies, but you can add it if you like
### Instagram limits
-Instagram API is requests limited. For one request, the program receive only 50 posts. Before catching error 429, the program can process 200 requests. I reduced this to 195 requests and set a timer to wait for the next request after.
+The Instagram API is rate-limited. For one request, the program receives only 50 posts. Before catching error 429, the program can process 200 requests. I reduced this to 195 requests and set a timer to wait before the next request. This was added to bypass error 429 and prevent an account ban.
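The throttling described here (count requests and pause before the ~200-request ceiling) can be sketched as follows. This is only an illustration of the idea, not the program's actual code; the function and parameter names are invented, and the real wait interval is internal to SCrawler:

```python
import time

REQUEST_LIMIT = 195       # stay just under Instagram's ~200-request ceiling
POSTS_PER_REQUEST = 50    # the API returns at most 50 posts per request

def fetch_all_posts(fetch_page, wait_seconds=60.0):
    """Collect posts page by page, pausing before the rate limit is hit.

    `fetch_page` is a caller-supplied function that performs one API
    request and returns a list of up to POSTS_PER_REQUEST posts
    (an empty list means there is nothing left to fetch).
    """
    posts = []
    requests_made = 0
    while True:
        if requests_made >= REQUEST_LIMIT:
            time.sleep(wait_seconds)  # wait out the rate window, then resume
            requests_made = 0
        page = fetch_page()           # one API request
        requests_made += 1
        if not page:
            break
        posts.extend(page)
    return posts
```

At 195 requests of 50 posts each, up to 9750 posts can be collected before the timer fires.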
## RedGifs
@@ -203,11 +209,11 @@ New-Item -Path "$p\TestDir" -ItemType Directory
SCrawler Batch command:
```
-D:\MyPrograms\SocialCrawler\Script.ps1 "{0}"
+D:\MyPrograms\SocialCrawler\Script.bat "{0}"
```
SCrawler Batch command 2:
```
-D:\MyPrograms\SocialCrawler\Script.ps1
+D:\MyPrograms\SocialCrawler\Script.bat
```
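The ```{0}``` in the first command is a placeholder that SCrawler fills in when it launches the script, presumably with a path, as suggested by the ```$p``` parameter in the PowerShell example from the hunk header above. In Python terms the substitution works like .NET's composite formatting; the template and path below are examples, not SCrawler's actual values:

```python
# Hypothetical illustration of how a "{0}" command template is expanded
# before the script is launched.
template = r'D:\MyPrograms\SocialCrawler\Script.bat "{0}"'
user_path = r'D:\Downloads\Reddit_SomeUserName1'

command = template.format(user_path)
# command is now the full command line with the path in place of {0}
```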
SCrawler PowerShell command:
```