Feature #3706

Simple Decrypter

Added by raztoki over 3 years ago. Updated 14 days ago.

Status: Closed
Start date: 08/30/2011
Priority: Normal
Due date:
Assignee: jiaz
% Done: 100%
Category: -
Target version: 020 - Next Release 2.0
Resolution: Fixed

Description

Came up with this idea some time ago, but here it is!

Categories
Decrypting / GUI / Controlling

Goal
Create an easy-to-use interface for 'simple decryption' tasks: easy access to JD's link catching and parsing features without the JD team or users having to write plugins, or manually 'add links' in the linkgrabber, for simple decrypting tasks.

Outcome
Using the existing features of JD, let users create their own decrypter(s) within an easy-to-use interface! Currently a user must manually insert links via the linkgrabber's 'Add Links' dialog, or use custom addons in their browser. Addons are not available for every browser; this feature would give more control to users who rely on clipboard monitoring, and would also improve what simple browser addons can do. It should also reduce false positives (*.swf, *.jpg, *.png, etc.) if configured correctly.

Proposed User Interface
Create a user interface like the linkgrabber's ignore/allow rules, but for customisable user decrypters:
+ option to enable/disable each rule
+ 'filter name' or 'filter name grouping' (grouping may not be needed, as rules can be named 1a, 1b, 1c, etc., as long as the list order can be sorted)
+ regexes to catch (e.g. a domain name, or domain/path)
+ options on scan type: a quick scan based on the already existing filters, but on groupings like 'hoster and/or decrypter and/or file extension filter (mp3/avi/etc.)'; or a more 'Advanced' option to catch a regex string of the user's choosing instead of the predefined plugin groupings
+ option to disregard custom linkgrabber settings (ignores/allows); custom linkgrabber filters should still be obeyed by default
+ option that allows one custom decrypter rule/group to trigger another (default: off); could be directly related to 'filter name grouping', though not exclusively so
+ option to halt 'deep crawling' after a given number of steps (a process watchdog), preventing endless loops (e.g. default: 1). Each rule governs itself. Unlimited (0) is discouraged, and a hard upper limit should be enforced.
+ option to leave the current domain for deep-crawling purposes (default: no; not recommended)
+ basic authentication (user:pass), perhaps via a quick link back into the main basic authentication database
+ upload feature so users can share these filters in a live database; give search ability over the live database or a locally stored copy (like the modem/router list)

+ global control to enable/disable simple decrypters
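The rule options above could be modelled as a small per-rule record. The following is a minimal Python sketch under my own assumptions; the class and field names are illustrative and are not JD's real API:

```python
import re
from dataclasses import dataclass

@dataclass
class SimpleDecrypterRule:
    """Hypothetical 'simple decrypter' rule, mirroring the proposed options."""
    name: str                   # 'filter name' (e.g. "1a" for manual grouping)
    pattern: str                # regex to catch, e.g. a domain or domain/path
    enabled: bool = True        # per-rule on/off switch
    max_depth: int = 1          # deep-crawl watchdog limit (0 = unlimited, discouraged)
    leave_domain: bool = False  # whether deep crawling may leave the current domain

    def matches(self, url: str) -> bool:
        # A disabled rule never matches, regardless of its pattern.
        return self.enabled and re.search(self.pattern, url) is not None

rule = SimpleDecrypterRule(name="example", pattern=r"example\.org/files/")
print(rule.matches("http://example.org/files/abc"))  # True
print(rule.matches("http://other.net/abc"))          # False
```

A global enable/disable switch would then simply short-circuit `matches()` for every rule at once.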

How
This is based around crawling and parsing of websites, finding matches and returning results.

The Process
1) user creates a 'simple decrypter' rule
2) user copies a URL into the clipboard, or passes it to the linkgrabber via the API
3) linkgrabber matches it against the 'simple decrypter' regex
4) the 'simple decrypter' processes the task (crawl + parse + collect results, repeating as needed)
5) results are passed into the other standard decrypter/hoster plugins (if required; this step is not governed by the process watchdog)
6) the final result(s) are returned
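The steps above can be sketched in Python. This is an assumption-laden illustration, not JD's implementation: `fetch_links` is a hypothetical stand-in for the real crawler/parser, and the rules are plain dicts:

```python
import re

def fetch_links(url):
    # Placeholder crawler/parser: real code would download the page
    # and extract links from it. Hard-coded here for illustration.
    return {"http://example.org/page": ["http://example.org/file1.zip"]}.get(url, [])

def run_simple_decrypter(url, rules, max_depth=1):
    # Step 3: the URL must match an enabled rule, otherwise do nothing.
    if not any(r["enabled"] and re.search(r["pattern"], url) for r in rules):
        return []
    # Step 4: crawl + parse, with the watchdog bounding the loop depth.
    results, frontier = [], [url]
    for _ in range(max_depth):
        next_frontier = []
        for u in frontier:
            for link in fetch_links(u):
                results.append(link)        # steps 5/6: hand results onward
                next_frontier.append(link)
        frontier = next_frontier
    return results

rules = [{"pattern": r"example\.org", "enabled": True}]
print(run_simple_decrypter("http://example.org/page", rules))
```

Note how the watchdog (`max_depth`) only bounds the crawl in step 4; handing results to standard plugins afterwards is outside its scope, as step 5 specifies.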

Conclusion
This concept would greatly benefit JDownloader, the JD user base, and even our coding team, as it would reduce the workload of making and maintaining decrypters for simple websites/tasks. Requests for simple decrypters are often declined, leaving users unsatisfied.

This won't necessarily replace decrypters, as they are still needed for more complicated work such as multi-step tasks and captcha processing. It simply gives users the flexibility of making their own simple filters within seconds!

Keen to hear your thoughts?


Related issues

related to Feature #3668: Auto deep decryption Closed 08/20/2011
related to Feature #3672: Linkgrabber: Manipulation of links within linkgraber sort... Closed 08/21/2011
related to Feature #2785: selectable extensions Closed 12/24/2010

History

Updated by pspzockerscene over 3 years ago

Nice idea but i think it's better for JD FarFar away^^

Updated by raztoki over 3 years ago

maybe I'm too optimistic grins

Updated by raztoki over 3 years ago

  • Target version changed from 010 - Next Major Public Release 1.xxx to 040 - FarfarAway

Updated by jiaz 14 days ago

Advanced Settings- Linkcrawler rules
for example
[
  {"pattern": ".*\\.cyr$", "name": "extension", "rule": "DIRECTHTTP", "id": 1418037876514, "enabled": true},
  {"pattern": ".*daniel\\.intranet\\.appwork\\.org", "name": "deepdecrypt", "rule": "DEEPDECRYPT", "id": 1418037876515, "enabled": true}
]
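To show how rules in this JSON shape could be consumed, here is a minimal Python sketch. The field names (`pattern`, `name`, `enabled`) come from jiaz's example above; the matching logic itself is my own illustration, not JD's code:

```python
import json
import re

# The linkcrawler-rule JSON from the comment above (raw string keeps
# the JSON's escaped backslashes intact).
raw = (r'[{"pattern":".*\\.cyr$","name":"extension","rule":"DIRECTHTTP",'
       r'"id":1418037876514,"enabled":true},'
       r'{"pattern":".*daniel\\.intranet\\.appwork\\.org",'
       r'"name":"deepdecrypt","rule":"DEEPDECRYPT","id":1418037876515,"enabled":true}]')

rules = json.loads(raw)

def matching_rules(url):
    # Return the names of all enabled rules whose pattern matches the URL.
    return [r["name"] for r in rules
            if r["enabled"] and re.match(r["pattern"], url)]

print(matching_rules("http://host/file.cyr"))  # ['extension']
```

Note that the patterns are anchored at the start (`re.match` plus a leading `.*`), so each rule effectively matches the whole URL.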

Updated by jiaz 14 days ago

  • Status changed from Waiting for Feedback to Closed
  • % Done changed from 0 to 100
  • Resolution set to Fixed

Implemented in the backend; currently available via advanced config.

Updated by raztoki 14 days ago

  • Assignee set to jiaz
  • Target version changed from 040 - FarfarAway to 020 - Next Release 2.0
