Target version: 020 - Next Release 2.0
Came up with this idea some time ago, but here it is!
Decrypting / GUI / Controlling
Create an easy-to-use user interface for 'simple decryption' tasks: easy access to JD's link-catching and parsing features without the need for the JD team or users to write plugins or manually 'add links' in the LinkGrabber for simple decrypting tasks.
Using the existing features of JD, let users create their own decrypter(s) within an easy-to-use interface! Currently a user needs to manually insert links via the LinkGrabber's 'Add Links' dialog or use custom addons in their browser. Addons are not available for every browser; this would give more control to users who rely on clipboard monitoring, but would also improve simple browser addons! It should also reduce false positives (*.swf, *.jpg, *.png, etc.) if configured correctly.
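To illustrate the false-positive filtering mentioned above, here is a minimal sketch in Python. The domain, path, and extension list are invented for the example; they are not JD's actual filter syntax:

```python
import re

# Hypothetical rule pattern: match links under example.com/files/ while
# skipping common static/media extensions, so clipboard monitoring does
# not pick up banners, stylesheets, etc.
RULE_PATTERN = re.compile(
    r"https?://(?:www\.)?example\.com/files/"
    r"(?!.*\.(?:swf|jpg|png|gif|css|js)(?:$|\?))\S+",
    re.IGNORECASE,
)

def matches_rule(url: str) -> bool:
    """Return True if the URL should be handed to the simple decrypter."""
    return RULE_PATTERN.match(url) is not None
```

The negative lookahead is what cuts the false positives: the rule fires only when the URL does not end in one of the excluded extensions.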
Proposed User Interface
Create a user interface like the LinkGrabber ignore/allow filters, but for customisable user decrypters:
+ option to enable/disable each rule
+ 'filter name' or 'filter name grouping' (grouping may not be needed, as you can name 1a 1b 1c etc as long as you can sort list order)
+ 'regexes' to catch (e.g. domain name or domain/xx/).
+ options on scan type: a quick scan based on our existing filters but on groupings like 'hoster and/or decrypter and/or file extension filter (mp3/avi/etc.)', or a more 'advanced' option to match a regex string of the user's choosing instead of our predefined plugin groupings.
+ option to disregard custom linkgrabber settings (ignores/allows). Should still obey custom linkgrabber filters by default.
+ option that allows one custom decrypter rule/group to trigger another. (default: off) Could be directly related to 'filter name grouping' though not exclusively so
+ option to halt 'deep crawling' from processing any further, based on how many steps it has taken (Process Watchdog), preventing endless loops (default: 1). Each rule governs itself. Advise against unlimited (0) and set a hard upper limit.
+ option to leave the current domain for deep crawling purposes (default: no). Not recommended.
+ basic authentication (user:pass) (use a quick link back into the main basic authentication db?)
+ upload feature so users can share these filters via a live database. Give search ability over the live database or a locally stored db (like the modem/router list).
+ global control to enable/disable simple decrypters
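The per-rule options above could be captured in a single structure. A sketch, assuming nothing about JD's actual config keys (every field name here is hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SimpleDecrypterRule:
    """One user-defined 'simple decrypter' rule (hypothetical fields)."""
    name: str                           # 'filter name' (e.g. "1a", "1b" for grouping by sort order)
    pattern: str                        # regex to catch (domain or domain/xx/)
    enabled: bool = True                # per-rule enable/disable
    scan_type: str = "quick"            # "quick" (predefined groupings) or "advanced" (custom regex)
    obey_linkgrabber_filters: bool = True   # disregard custom LinkGrabber ignores/allows if False
    can_trigger_other_rules: bool = False   # one rule/group may trigger another (default: off)
    max_crawl_depth: int = 1            # Process Watchdog; 0 = unlimited (discouraged)
    leave_domain: bool = False          # allow deep crawl to leave the matched domain (not recommended)
    auth: Optional[str] = None          # basic authentication, "user:pass"
```

The defaults mirror the recommendations in the list: watchdog depth 1, stay on-domain, obey LinkGrabber filters.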
This is based around crawling and parsing of websites, finding matches and returning results.
1) user creates 'simple decrypter' rule
2) user copies a URL into the clipboard or passes it to the LinkGrabber via the API
3) LinkGrabber matches it against a 'simple decrypter' regex
4) 'simple decrypter' processes the following task (crawler + parser + result ++ repeat)
5) passes results into other standard decrypter/hoster plugins (if required, not governed by process watchdog)
6) returns final result(s).
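Steps 3-6 can be sketched as a bounded crawl-parse loop. This is a minimal illustration only; `fetch`, `deep_pattern`, and the HTML parsing are stand-ins, not JD internals:

```python
import re
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def simple_decrypt(start_url, fetch, deep_pattern, max_depth=1):
    """Crawl + parse + repeat, bounded by the Process Watchdog depth.

    fetch(url) -> HTML string; links matching deep_pattern are results,
    everything else is crawled in the next round (up to max_depth rounds).
    """
    results, frontier = [], [start_url]
    for _ in range(max_depth):            # watchdog: hard step limit, no endless loops
        next_frontier = []
        for url in frontier:
            parser = LinkExtractor()
            parser.feed(fetch(url))
            for link in parser.links:
                if re.search(deep_pattern, link):
                    results.append(link)          # final result for hoster plugins
                else:
                    next_frontier.append(link)    # crawl further next round
        frontier = next_frontier
    return results
```

With `max_depth=1` only the starting page is parsed; raising the depth lets the rule follow intermediate pages, which is exactly what the watchdog option is there to cap.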
This concept would greatly benefit JDownloader, the JD user base, and even our coding team, as it would reduce the workload of making/maintaining decrypters for simple websites/tasks. Quite often requests for simple decrypters are declined, leaving users unsatisfied.
This won't necessarily replace decrypters, as they are still needed for more complicated tasks like multi-step processing and captcha handling. It gives users the flexibility to make their own simple filters within seconds!
Keen to hear your thoughts?
Updated by raztoki over 3 years ago
- Target version changed from 010 - Next Major Public Release 1.xxx to 040 - FarfarAway
Advanced Settings - LinkCrawler rules
- Status changed from Waiting for Feedback to Closed
- % Done changed from 0 to 100
- Resolution set to Fixed
Implemented in the backend; currently available via advanced config.
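For reference, a LinkCrawler rule in the advanced config is a JSON object; the sketch below shows a deep-decrypt rule in that style. Field names and values follow JDownloader's documented LinkCrawler Rules format as best I recall it, so verify them against the current knowledge base before use:

```json
[
  {
    "enabled": true,
    "name": "example deep decrypt",
    "pattern": "https?://(www\\.)?example\\.com/files/.+",
    "rule": "DEEPDECRYPT",
    "maxDecryptDepth": 1,
    "deepPattern": "\"(https?://example\\.com/dl/[^\"]+)\""
  }
]
```

`pattern` plays the role of the 'regex to catch' above, `maxDecryptDepth` is the Process Watchdog limit, and `deepPattern` selects which links on the crawled page become results.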