Above is a picture of the current modules that I have installed for RACHEL. I'm running RACHEL as a client-side offline server on Ubuntu 18.04.3. I installed linkchecker and ran `linkchecker www.rachel.com > linkcheck.csv` to send the errors to a CSV file. I left the system running overnight, and when I came back it had frozen after running for 21 hours and 1 minute (pic in a reply). Who can I reach out to to get this fixed, or at least to let them know that this problem exists? At 3 hours into running linkchecker there were already over 8,000 broken links. The CSV file wouldn't even open because there were so many.
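For anyone who hits the same wall with a CSV too big to open, something like the sketch below should summarize it from the command line instead. It's untested and makes assumptions: it assumes linkchecker was run with `-o csv` (plain `>` redirection captures the default text format, not real CSV), that the file is semicolon-separated with comment lines starting with `#`, and that it has `valid` and `result` columns; the exact column names can differ between linkchecker versions.

```python
# Rough sketch: tally broken links from a huge linkchecker CSV without
# opening it in a spreadsheet. Assumes the file came from
# `linkchecker -o csv www.rachel.com > linkcheck.csv`.
import csv
from collections import Counter

error_counts = Counter()

with open("linkcheck.csv", newline="", encoding="utf-8", errors="replace") as f:
    # linkchecker prepends comment lines starting with '#'; skip them.
    rows = csv.DictReader(
        (line for line in f if not line.startswith("#")), delimiter=";"
    )
    for row in rows:
        # Assumed columns: 'valid' ("True"/"False") and 'result'
        # (e.g. "404 Not Found"); adjust to your version's header row.
        if row.get("valid") == "False":
            error_counts[row.get("result") or "unknown"] += 1

# Print the 20 most common failure types with their counts.
for result, count in error_counts.most_common(20):
    print(f"{count:>8}  {result}")
```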
Has anyone run a linkchecker on modules they have downloaded? If so, how did you solve it?
Hi @david.gomez, that's not surprising. There is only so much we can do when making an offline copy of a site, both functionally, because it's difficult, and legally, because in many cases we are required to keep certain "about us" attributions in the sites we, or others, scrape. Much of that content is created by volunteers, so there are bound to be some links that get missed.
I'm assuming you're doing this all offline. You could have one attribution in the "copyright" section of Saylor Academy, for instance, which is easily repeated 8,000 times. There are millions and millions of links here, and many of them may have had 404 errors when we copied the sites as well.
Thanks for the reply @jeremy. Yes, I understand; I just wanted to make sure I wasn't the only one seeing this issue. I couldn't find anything on the forum about it.