[Ext] Clean Links 2.4

Announce and Discuss the Latest Theme and Extension Releases.
diegocr
Posts: 182
Joined: July 7th, 2008, 1:02 pm
Contact:

Re: [Ext] Clean Links 2.4

Post by diegocr »

MC_Fat_Tongue wrote:Whitelisting the full domain is overkill in some situations, when sometimes it's just the one link that needs skipping.


Yeah, that's precisely why that message is on the whitelisting panel. The feasible way at the moment is to inspect the original/cleaned links there and manually add the tag(s) to the "Skip Links..." option, which shouldn't be a headache unless you have dozens of links you want to filter... other than that, I'll think about it. Thanks for your feedback :)


WebVoyager wrote:For instance, the following bookmarklet sends the page I call it from to Unmaskparasites.com:
javascript: url = 'http://www.UnmaskParasites.com/security-report/?page='+encodeURIComponent(location.href); void(window.open(url));


To skip bookmarklets you have to add javascript:| at the beginning of the "Skip Links..." option. If you additionally want Unmaskparasites links to be left untouched, add its domain to the "Skip Domains..." option, or right-click the toolbar button (as long as it has been cleaned at least once in the current session, you'll find it there for easy whitelisting).
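To illustrate how that prefix works, here's a sketch only: it assumes the "Skip Links..." option is used as a regular-expression alternation tested against each link, as the trailing | suggests, and it is not the extension's actual code (the extra entries are made up):

```javascript
// Hypothetical sketch: the "Skip Links..." option as a regex alternation,
// with "javascript:" prepended to existing (made-up) entries.
const skipLinks = new RegExp('javascript:|callback|return');

const bookmarklet = "javascript: url = 'http://www.UnmaskParasites.com/" +
  "security-report/?page=' + encodeURIComponent(location.href); " +
  "void(window.open(url));";

// The bookmarklet matches the "javascript:" branch, so it is left untouched.
console.log(skipLinks.test(bookmarklet));               // true
console.log(skipLinks.test('http://example.com/page')); // false
```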


Lew_Rockwell_Fan wrote:Is there some kind of "Clean Links for Dummies" manual somewhere?


There is no formal documentation other than what is provided on the AMO listing page, since I don't think it's needed. Basically, any nested or obfuscated link will be intercepted and cleaned before you visit it, that's all :)

Also, you don't need to mess with the command line; this is a browser extension for Firefox...

Ideally the default options should meet most users' needs; there are, however, cases where you'll want to whitelist some sites or add new [tracking] tags to be cleaned from URLs.
You have your way. I have my way. As for the right way, the correct way, and the only way, it does not exist...
https://addons.mozilla.org/user/diegocr/
JPL5780
Posts: 10
Joined: April 14th, 2008, 1:03 am
Location: Surrey, UK

Re: [Ext] Clean Links 2.4

Post by JPL5780 »

Hi,

When Clean Links is enabled, it is not possible to Sign Out from https://myaccount.sky.com/

JPL
PadaV4
Posts: 308
Joined: October 14th, 2013, 1:20 pm

Re: [Ext] Clean Links 2.4

Post by PadaV4 »

Since addons.mozilla.org is no place to have a discussion, I'm posting here. You said:
"So... i told you how to resolve your "issue" by whitelisting the domain, and you come back here to blame the add-on? Very professional and friendly mate."
I'm sorry, but you haven't told me anything. The review I left before was deleted almost immediately; apparently they don't like links in reviews.

The only thing written in your response to my review before it was deleted was two sentences: the first said I make no sense, the second said I'm spreading FUD.
Maybe you edited your response after that and "resolved my issue", but I have no way of verifying that, because my one-star review has disappeared along with your response.
diegocr
Posts: 182
Joined: July 7th, 2008, 1:02 pm
Contact:

Re: [Ext] Clean Links 2.4

Post by diegocr »

@JPL5780

You just have to whitelist "myaccount.sky.com" adding it to the "Skip Domains..." option.


@PadaV4

Yeah, that was my initial answer, because you said you weren't able to resolve the "issue" by "tinkering" with the options, which, sorry, didn't make sense to me.

Then I posted a new answer specifying what you have to do, i.e. adding "pasts.tvnet.lv" to the Skip Domains option, or right-clicking the toolbar button. You should be receiving replies to your reviews by email, unless you provided a disposable/fake email address on AMO.

Also, in your last review you state "It touches and rewrites links it shouldn't"... Well, if a link contains a nested URL - or tracking tags (such as the UTM ones on your posted links) - the link is indeed touched/rewritten; that's the purpose of the add-on, after all.
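As a minimal sketch of that touching/rewriting (illustration only, not CleanLinks' actual code; the example.com URLs are made up): unwrap a URL nested in a query parameter, then strip utm_* tags.

```javascript
// Sketch of link cleaning (illustration only, not the extension's code).
function cleanLink(href) {
  const url = new URL(href);

  // If a full URL is nested in a query parameter, unwrap and clean it instead.
  for (const value of url.searchParams.values()) {
    if (/^https?:\/\//.test(value)) return cleanLink(value);
  }

  // Strip common tracking tags such as utm_source, utm_medium, ...
  for (const key of [...url.searchParams.keys()]) {
    if (/^utm_/i.test(key)) url.searchParams.delete(key);
  }
  return url.toString();
}

console.log(cleanLink('http://example.com/?a=1&utm_source=feed')); // http://example.com/?a=1
```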
JPL5780
Posts: 10
Joined: April 14th, 2008, 1:03 am
Location: Surrey, UK

Re: [Ext] Clean Links 2.4

Post by JPL5780 »

Diego,

Thanks for your help

I had already tried whitelisting sky.com, but this did not work. I assumed that included sub-domains, but it appears not.

Whitelisting skyid.sky.com (not myaccount...) does the trick. Problem solved.

JPL
bege
Posts: 153
Joined: January 23rd, 2009, 9:14 pm
Location: Germany

Re: [Ext] Clean Links 2.4

Post by bege »

Hi,
please add to the toolbar button a link leading to the whitelisting options.
Thank you.
piotrkustal
Posts: 3
Joined: May 18th, 2014, 12:03 pm

Re: [Ext] Clean Links 2.4

Post by piotrkustal »

When can we expect a plain-text-to-clickable-URLs feature to be implemented in your awesome extension?
diegocr
Posts: 182
Joined: July 7th, 2008, 1:02 pm
Contact:

Re: [Ext] Clean Links 2.4

Post by diegocr »

@bege

Are you aware that, starting with 2.5, you can right-click the toolbar button for easy whitelisting?

@piotrkustal

There are some good add-ons on AMO which do that already, so I think it'd be a little redundant to add that feature to CleanLinks... it would also involve traversing the DOM on each page load, which I'd rather not do (Event Delegation is turned on by default nowadays, so that'd be a step backwards, slowing down the browser).

However, rather than converting text links into "visually clickable" links, the ED mode could obviously still catch clicks on text nodes and check whether there's a readable link there, so there'd be no need to traverse or otherwise deal with the DOM. I'll consider implementing that.
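Something like this rough sketch (hypothetical, not a committed design): a single delegated click listener, plus a pure helper that scans the clicked text for a readable link.

```javascript
// Pure helper: pull the first readable URL out of a piece of text, if any.
const URL_RE = /https?:\/\/[^\s"'<>]+/;
function findTextLink(text) {
  const m = text && text.match(URL_RE);
  return m ? m[0] : null;
}

// Delegated listener (browser-only): no DOM traversal on page load;
// text is only inspected when it is actually clicked.
if (typeof document !== 'undefined') {
  document.addEventListener('click', (event) => {
    // Real <a> elements are handled by the normal link-cleaning path.
    if (event.target.closest && event.target.closest('a')) return;
    const link = findTextLink(event.target.textContent);
    if (link) {
      event.preventDefault();
      window.open(link); // hypothetical: open the plain-text link
    }
  });
}
```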
halpls
New Member
Posts: 1
Joined: July 13th, 2014, 3:05 am

Re: [Ext] Clean Links 2.4

Post by halpls »

Hi, I'm having trouble using CleanLinks on slickdeals.com.

For example, using a random page on Slickdeals:
http://slickdeals.net/f/7058208-lg-42-inch-led-42ln5400-1080p-120hz-299-dell?v=1

All the links to other websites on that page are redirected through Slickdeals.
This is what I get when I copy the Amazon link there:
http://slickdeals.net/?abt=5:26&sdtid=7058208&sdop=1&sdpid=69160140&sdfid=9&sdfib=1&lno=2&trd=http+www+amazon+com+LG+Electron+&u2=http%3A%2F%2Fwww.amazon.com%2FLG-Electronics-42LN5400-42-Inch-1080p%2Fdp%2FB00BB9ORUS
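For what it's worth, the real Amazon URL appears to be sitting, percent-encoded, in the u2 parameter, and can be pulled out like this (shortened example URL for illustration):

```javascript
// The destination is percent-encoded in the "u2" query parameter of the
// slickdeals redirect (shortened example URL for illustration).
const tracked = 'http://slickdeals.net/?sdop=1&u2=http%3A%2F%2Fwww.amazon.com%2Fdp%2FB00BB9ORUS';

// URLSearchParams decodes the percent-encoding automatically.
const target = new URL(tracked).searchParams.get('u2');
console.log(target); // http://www.amazon.com/dp/B00BB9ORUS
```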
WebVoyager
Posts: 160
Joined: November 20th, 2004, 4:58 am

Re: [Ext] Clean Links 2.4

Post by WebVoyager »

I could manage without Clean Links add-on, but I'd miss it.

A minor but recurring (confirmed) problem here:
whenever another add-on is updated or added, the Clean Links button disappears, and I have to restart Firefox twice to get it back on the toolbar.
I'm on Pale Moon 24.6.2 x64 (~Firefox 24 ESR) on Windows 7 64-bit.

Anyone else experiencing this issue?

EDIT: to improve my chances of getting a reply, I'll drop Pale Moon and stand on Firefox 24 ESR :) Hope that helps...
I'm no longer here -- Cannot delete/close my profile -- Bye
DAOWAce
Posts: 45
Joined: April 29th, 2012, 4:59 pm
Location: US East
Contact:

Re: [Ext] Clean Links 2.4

Post by DAOWAce »

This breaks quite a lot of things.

Nexusmods is completely broken if you try to navigate the mod details: it works for one click, then once the tracking stuff gets put into the URL, CleanLinks breaks everything. I was forced to whitelist the site, as I don't know enough to fix this myself, which is a shame, because navigating the site adds a ton of junk to the URL history. I wish they'd do something about that to hide it from the browser.

It also breaks clicking certain things on various other sites, such as marking reviews as helpful on Amazon.

I also can't seem to figure out how to remove certain things from certain URLs, such as Newegg's ?nm_mc= stuff. ex: http://www.newegg.com/special/shellshoc ... id=3588348

I've put "nm" and "nm_mc" in both the ? and utm_ sections, but it doesn't seem to affect anything. There are also &nm_mc= URLs. I'm at a loss; I really don't understand these things.

Edit: it also breaks logins on NCSoft.com and Wordpress.com. Edit 2: and a lot more sites.

It also causes some links to open in the same tab instead of a new one.
diegocr
Posts: 182
Joined: July 7th, 2008, 1:02 pm
Contact:

Re: [Ext] Clean Links 2.4

Post by diegocr »

@halpls:

Yup, there's a known issue with the Copy Link controller option: https://github.com/diegocr/CleanLinks/issues/47

@WebVoyager

Well, a similar issue has been discussed here when upgrading the browser, though nothing about upgrading add-ons as well.

This may or may not help, feel free to give it a try in any case: https://support.mozilla.org/en-US/kb/tr ... et-firefox


DAOWAce wrote:This breaks quite a lot of things.


Well, that's why the Skip Links and Skip Domains (whitelist) options exist in the first place :)

DAOWAce wrote:Nexusmods is completely broken if you try to navigate the mod details.


If you don't want to whitelist the whole domain, add navtag| to the beginning of the Skip Links option (navtag is the query parameter holding the nested URL while navigating there).

DAOWAce wrote:It also breaks clicking certain things in various other sites, such as clicking review helpfulness on Amazon.


https://github.com/diegocr/CleanLinks/issues/52

DAOWAce wrote:I also can't seem to figure out how to remove certain things from certain URLs, such as Newegg's ?nm_mc= stuff. ex: http://www.newegg.com/special/shellshoc ... id=3588348

I've put "nm" and "nm_mc" in both the ? and utm_ sections, but it doesn't seem to affect anything. There are also &nm_mc= URLs. I'm at a loss; I really don't understand these things.


"nm" would match ..aspx?nm=foo - it doesn't ..aspx?nm_mc=foo nor ..aspx?foo_nm=bar

You need to add nm\w+| at the beginning of the Remove From Links option.
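To see the difference (an illustration only: it assumes the entries are matched anchored on the parameter name, as the examples above imply, and it is not the extension's actual matching code):

```javascript
// Entry "nm" vs entry "nm\w+", matched against the query string (illustration;
// assumes anchoring on the parameter name as the examples above imply).
const bare = /[?&]nm=/;    // from entry "nm"
const wild = /[?&]nm\w+=/; // from entry "nm\w+"

const url = 'http://www.newegg.com/x.aspx?nm_mc=AFC-Foo&cm_sp=bar';

console.log(bare.test(url)); // false: "nm_mc" is not exactly "nm"
console.log(wild.test(url)); // true:  \w+ matches the "_mc" part
console.log(bare.test('http://example.com/?nm=foo')); // true
```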

DAOWAce wrote:Edit: it also breaks logins on NCSoft.com and Wordpress.com. Edit 2: and a lot more sites.


If these are issues which can't be resolved by whitelisting, feel free to file them in the GitHub tracker.

DAOWAce wrote:Also causes some links to open in the same tab instead of in a new one.


Enable the Follow Target Attribute option.
DAOWAce
Posts: 45
Joined: April 29th, 2012, 4:59 pm
Location: US East
Contact:

Re: [Ext] Clean Links 2.4

Post by DAOWAce »

Thanks for the explanations. I'm very, very slowly beginning to understand how to manage these regular expressions.

The login issues I mentioned were due to one of the removals I had set; I fixed it a bit after posting. Still, after adding a bunch of stuff over the last few months, it's caused back-and-forth breakage between things, which leads me to try to whitelist things without breaking removals. I assume it's because I'm throwing everything into one group of removals/whitelists. If I understood regexps better I could get things working right, but even reading about them confuses me.

I have this under removals: (?:ref|aff|snr|list|nxid|ie|qid|sid)
And this at the end of the whitelist: ?url=|magnet:|nextpage|list_of_subs|callback|base|return|locale|cont

I don't doubt there's something wrong there, but again, I don't understand how to write these things out properly.

Do you have a good tutorial site for learning regexps that you could link me to? I'm trying to reference this, but I feel like a bit of an idiot reading it.
Soul Stealer
Posts: 480
Joined: March 31st, 2007, 1:18 pm
Location: God's Country

Re: [Ext] Clean Links 2.4

Post by Soul Stealer »

I'm sure that somewhere in these 118 posts you've already answered this question, but either I can't find it or I missed it...

When I click this link:

Code: Select all

http://links.govdelivery.com/track?type=click&enid=Z(lots of characters here)Ym&&&100&&&http://www.fda.gov/Safety/Recalls/ucm424607.htm?source=govdelivery&utm_medium=email&utm_source=govdelivery


I get:

Code: Select all

http://www.fda.gov/Safety/Recalls/ucm424607.htm?source=govdelivery


That by itself is enough to make me ecstatic, but I'm wondering whether the stuff from the question mark onwards shouldn't be removed as well?
It's like I said.
gilsayer
Posts: 2
Joined: December 2nd, 2011, 9:20 pm

Re: [Ext] Clean Links 2.4

Post by gilsayer »

Hi,

More than one question, I'm afraid: is there any way to make Clean Links work only when copying a URL with the context menu's "Copy Link Location"? (i.e. it would not clean the URL on a left-click.)
(Probably Clean Links is "too improved" for such a rather simple task; yet I could not find any other add-on that does what I described above...)

And/or, can one use the "whitelist" in reverse, so to speak, i.e. allow Clean Links to work only on the domains specified?

Many thanks