User Help for Seamonkey and Mozilla Suite
Add-ons I have:
Show Parent Folder
Yahoo Mail Hide Ad Panel
Safe Mode didn't seem to show anything different.
In normal mode, I started SM with yahoo.com (my homepage) and this forum page open. Memory was steadily increasing and CPU hovered low. I restarted and opened the same tabs - this time there was no memory ramp and even lower CPU. When I've seen the very erratic CPU usage, I've had 4-8 tabs open; closing some didn't change much. It's as if something on a page triggers the unexpected behavior and it stays that way until the browser is closed.
2.46 appears to be multi-threaded. Is this true?
> 2.46 appears to be multi-threaded. Is this true?
No, not at all.
Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:18.104.22.168) Gecko/20110420 SeaMonkey/2.0.14 Pinball CopyURL+ FetchTextURL FlashGot NoScript
> No, not at all.
SeaMonkey is and was multithreaded but only in one process.
Then how is it I'm suddenly getting >50% CPU utilization for Seamonkey.exe on a quad-core CPU? Is it spawning child processes of some other EXE?
(and this happens just rendering yahoo.com)
SM definitely quieted down with JS off. Is there a change in the JS engine that makes it 'busier' than previous generations?
Not that I'm aware of. Mind you, I'm also not aware of the constant changes going on to the coding of all the commercial websites out there either. Website coding can change daily, but still look the same.
For example - a search results page on Google just sits there, doesn't refresh, yet screams away at the CPU - why? You'd think that it must be because of enhanced functionality or something, yet bizarrely, when you turn off JS on that site, the functionality improves! Hidden search options and filters appear!
For sites that are a permanent pain, like Google and others, I use YesScript, which is a simple one click blacklist that stops JS running on a list of chosen sites. For anything else, I just use a simple JS toggle button, but rarely need it.
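If you'd rather not install anything, the same kill switch exists as a plain preference, `javascript.enabled`, which you can flip in about:config or pin from a user.js file in your profile folder. A minimal sketch (assuming a profile-level override is what you want - it applies to every site, not per-site like YesScript):

```javascript
// user.js in the SeaMonkey profile folder -- read once at each startup.
// false = JavaScript off everywhere; set to true (or delete the line)
// to hand control back to the normal preferences UI.
user_pref("javascript.enabled", false);
```

Note that a user.js entry re-asserts itself on every restart, so it overrides anything a toggle button changed during the previous session.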
Oddly, different people sometimes get different results on the Net. For example, I bet that for some people Google 'works just fine'. Well, good for them, but I just know what sites don't work great for me and block the JS on them or move on.
What catches people out is that they assume that you'd need 20 or 30 sites open for the CPU* to scream away - nope, just 1 'rogue' one will do it.
* high RAM use is pretty tiresome, but it's high CPU use that grinds things to a halt.
I installed NoScript and tried it a bit. I think it does help. But what a pain - I have to set options for almost every page, and even then it's sometimes not clear which domain's script has broken the page.
Appreciate the hint, 4t8s.
If that's too much of a pain, try Cascading permissions mode - https://forums.informaction.com/viewtopic.php?p=85899#p85899
You can still block individual sites' scripts using the Untrusted sub-menu.
*Always* check the changelogs BEFORE updating that important software!