How to archive old Reason threads, please
Hello, how do users plan on archiving posts/threads from the old Reason forum before it closes?
any ideas?
(this place already feels cool and friendly with lots of features... a great job, congrats Lunesis)
cheers, j
littlejamaicastudios
i7 2.8ghz / 24GB ddr3 / Quadro 4000 x 2 / ProFire 610
reason 10 / reaper / acidpro /akai mpk mini / korg padkontrol / axiom 25 / radium 49
'i get by with a lot of help from my friends'
Not sure. I suppose I'll sticky the big ReFill thread in the Advanced forum, but all I can really think of is to copy and paste the good ones, since they're deleting everything.
OK, I got Ed to post the big ReFill sticky in the RE/R forum; a good start, I suppose!
About the spider: I'm working on setting it up. It's a bit tougher than just throwing in a URL... (of course)
I think I have authentication done via some cookie magic, so now it's on to the exclude list.
I thought I'd exclude the following fora:
- Post your Music
- Music Forum
- Mobile Apps Forum
But that still leaves me with a grand total of about 1.1 million posts. At 10 posts a page this may be a bit much, so I'm not quite sure I want to do that.
I've been looking at ways to download more posts at once. The max I can get is 40 via the print view; unfortunately, coaxing the page to give more doesn't work.
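In case anyone wants to follow along, the "cookie magic" amounts to something like this. A minimal sketch in Python; the login endpoint and form field names are my assumptions based on a typical vBulletin setup, not verified against the actual forum:
Code:
# Sketch: log in once and keep the session cookie for later page fetches.
# The login URL and form field names below are guesses, not verified.
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

jar = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

login_data = urllib.parse.urlencode({
    'vb_login_username': 'myuser',   # hypothetical field name
    'vb_login_password': 'mypass',   # hypothetical field name
    'do': 'login',
}).encode()

# Hypothetical login endpoint on the old forum:
opener.open('https://www.propellerheads.se/forum/login.php', login_data)

# The jar now holds the session cookie; later requests through this
# opener are authenticated.
html = opener.open('https://www.propellerheads.se/forum/index.php').read()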
More on this later..
[edit]
Maybe we can appeal to the Props to give us the DB for the posts.
Personally, I can't stand throwing away a good source of information. It feels like burning a library. You just don't do that :S
V9 | i7 5930 | Motu 828 MK3 | Win 10
eauhm wrote: Personally, I can't stand throwing away a good source of information. It feels like burning a library. You just don't do that :S
That's exactly what I think! A bit like "burning books"; it's just WRONG!
You're doing great work, eauhm! All power to ya!
jfrichards
eauhm, is it possible to filter out some PUF threads or posts for transfer? One example would be to transfer all the Selig posts, because they are so heavily technical in content. There may be some other people like that (maybe not); possibly Peff or James Bernard.
Individuals could maybe copy over any posts that they made, or that they really appreciate, that are of a purely technical nature; maybe have a Technical Tips or Ask the Experts section over here to park it all.
frog974new
jfrichards wrote: eauhm, is it possible to filter out some PUF threads or posts for transfer? One example would be to transfer all the Selig posts, because they are so heavily technical in content. [...]
It seems the better way is to filter some famous topics by some users (all the biggest community resources on the PUF).
Selig, Noel, Benedict and many more folks always had some good, valuable info. If they are willing to move all their valuables over, then we can restart a new forum life. I guess the rest of the info will have to slowly reappear in increments, or as people start asking new questions. Knowledge is powerful, and it really feels like PH didn't think this through clearly; they are doing us a disservice. I hope things come together nicely in this new place; by the looks of it, it seems like it will. Kudos and huge thanks go to you, Lunesis. You're a life saver!
Guts Electronic Mayhem
hello,
it might be cool if we could all use the same 'spider program' to download our own work or that of others and then consolidate it here
share the workload to make sure we get what's most valuable
i'm not sure if this is possible or not, but the knowledge should be preserved
the PUF was like going to audio school for me
cheers, j
littlejamaicastudios
i7 2.8ghz / 24GB ddr3 / Quadro 4000 x 2 / ProFire 610
reason 10 / reaper / acidpro /akai mpk mini / korg padkontrol / axiom 25 / radium 49
'i get by with a lot of help from my friends'
Ok, here's an update on my effort to try to download forum content.
It's a pain in the b***, to be honest.
I got the login issues sorted. Then the site needs to be in a certain layout, because I don't want to have to parse JavaScript. Sorted.
Then I want to minimize the number of pages to be downloaded, so I want as many posts on one page as possible. Keep in mind that pages cross-reference many others, so when the spider is building its indexes it has to decide what to skip (print layouts, for example) and what to download... etc. etc. It's a lot to sift through. I'm not done configuring yet, and I'm not sure I'll be able to get it sort of right either.
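To give an idea of what the spider has to decide, the keep/skip logic boils down to a URL filter, something like this sketch. The page names are the usual vBulletin ones, and the excluded forum IDs are made up, not the real PUF IDs:
Code:
# Sketch of the spider's keep/skip decision for each discovered URL.
# The excluded forum IDs below are placeholders, not the real ones.
from urllib.parse import urlparse, parse_qs

EXCLUDED_FORUM_IDS = {'42', '43', '44'}  # Post your Music, Music, Mobile Apps (hypothetical)

def should_download(url: str) -> bool:
    parts = urlparse(url)
    qs = parse_qs(parts.query)
    page = parts.path.rsplit('/', 1)[-1]
    # Skip duplicate renderings and interactive pages:
    if page in ('printthread.php', 'sendmessage.php', 'newreply.php'):
        return False
    # Skip listings that belong to an excluded forum:
    if page == 'forumdisplay.php' and qs.get('f', [''])[0] in EXCLUDED_FORUM_IDS:
        return False
    # Keep the index, forum listings and thread pages; skip everything else:
    return page in ('index.php', 'forumdisplay.php', 'showthread.php')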
About the suggestions:
Only downloading posts by certain users is going to make this even more complex, unfortunately.
Also, compiling a site out of the bits and pieces all of us want to save is going to be a tricky task. I'd love to be able to do that for the community, but... the prospect doesn't really fill me with much cheer.
It's tough. I owe a lot of progress to many a post on the PUF. What's written in there feels like a hand-written, personally tailored manual by people I've come to respect. It doesn't feel right that this can just be tossed aside.
But on the other hand, I also have to be honest with myself and acknowledge that after reading a thread when it happens, I don't read it back later. It may happen once every 6 months, but that's it.
So, what now?
I'll keep going at it for a bit more.
We may try asking for a bit more time with the forums to get the extract done. Maybe we can convince the Props to keep it running somewhere in read-only mode, but with altered settings that make extraction easier (I'm thinking of unlimited posts per page, for example...).
All in all, I find the short notice a bit harsh. But I can see that once such a decision is made, it may just be best to rip off the band-aid and be done with it, because now is as good a time as any.
V9 | i7 5930 | Motu 828 MK3 | Win 10
Would it be possible to use https://www.propellerheads.se/forum/search.php to search for posts by one particular user? There is an option to get results as posts rather than entire threads.
Then, perhaps https://www.propellerheads.se/forum/showpost.php could be used to get single posts. That would yield ONE post at a time, which is kind of the opposite of what you'd want for spidering the entire forum, but perhaps fine for downloading the work of a handful of users?
What to do, then, with a ton of out-of-context posts is another story. Perhaps it would be possible to ask those who originally made those posts to sift through them and sort out those with, well, self-contained explanations of stuff. That could then, in turn, perhaps be handed over to enthusiasts as material for a collection of tutorials.
Oh well, perhaps I'm just rambling here... I'm trying to be creative, I suppose.
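If someone wanted to try that route, fetching single posts could look something like this. A sketch only: I'm assuming showpost.php takes a post ID via a "p" parameter, which I haven't verified, and the post IDs are hypothetical:
Code:
# Sketch: save individual posts by ID via showpost.php.
# Assumes showpost.php takes ?p=<post id>; IDs below are hypothetical
# and would really come from search.php results.
import time
import urllib.request

# A plain opener works for public fora; member-only fora would need the
# cookie login from the earlier sketch in this thread.
opener = urllib.request.build_opener()

post_ids = [123456, 123789]

for pid in post_ids:
    url = f'https://www.propellerheads.se/forum/showpost.php?p={pid}'
    with open(f'post_{pid}.html', 'wb') as f:
        f.write(opener.open(url).read())
    time.sleep(1)  # be gentle with the server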
I'm up for helping with the process, but don't want to double up on anyone else's work. Is there some way we could join forces and distribute/organize the work?
[...just noticed I have 140 attachments that will also be lost. Time to work on getting those sorted, as I should still have most of them on my local drive here. Just need to figure out which ones belong to which posts... ;( ]
Selig Audio, LLC
I've done several tests now downloading the entire forum...
With the tools I'm familiar with, I'm not going to be able to automate it. Sorry.
I've worked with HTTrack and wget. One of the problems (as I mentioned earlier) is setting up a good exclude list to make sure mirroring doesn't go haywire... I've got that somewhat under control, but the other problem is that links in pages need to be rewritten to reflect that they change from a dynamic site into static pages. I may be able to script that, but it's going to take quite a bit of post-processing after I've downloaded all the files. I'm sorry, but my project to slurp in the entire site is stopped.
Of course, it's software, so anything is possible... and I love you all 'n such... but I'd rather make music (even if I'm not good at it).
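For anyone who wants to pick this up: the rewriting I mean is turning links like showthread.php?t=123&page=2 into file names that actually exist on disk. wget's --convert-links does part of this, but in my experience the query strings still want a pass like the following. An untested sketch; the "mirror" directory and the URL pattern are assumptions:
Code:
# Sketch: rewrite dynamic forum links to static file names in saved HTML.
# e.g.  showthread.php?t=123&page=2  ->  showthread_t123_page2.html
import re
from pathlib import Path

LINK = re.compile(r'(showthread|forumdisplay)\.php\?([^"\'#<>\s]+)')

def to_static(match: re.Match) -> str:
    page, query = match.group(1), match.group(2)
    # Turn the query string into a flat, filesystem-safe suffix.
    suffix = query.replace('&amp;', '&').replace('&', '_').replace('=', '')
    return f'{page}_{suffix}.html'

for path in Path('mirror').rglob('*.html'):
    text = path.read_text(encoding='utf-8', errors='replace')
    path.write_text(LINK.sub(to_static, text), encoding='utf-8')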
V9 | i7 5930 | Motu 828 MK3 | Win 10
What would be really cool is to just archive the meat and skip over any bad stuff in the topics; just the tips and tricks...
Could they be stickied in a dedicated PUF subforum?
I'm not sure archiving entire threads wouldn't bring back some bad memories here; just the good, helpful stuff is what we should be after. Just IMO, of course.
Oh, and anything I can do to do my part would be cool. Just let me know, I'm up for it. Cheers
Mac Studio M2 Ultra/64Gb/Apollo T-Bolt 3/OS 14.6.1/PT 2024.6/R13.02/Logic 11.01
MSI GT77/13980HX/RTX 4090m/64GB/Arturia Minifuse 2/PT 2024.6/R13.02/Low DPC latency tuned
Bavanity wrote: What would be really cool is to just archive the meat and skip over any bad stuff in the topics; just the tips and tricks... [...]
The fastest way to get the content of a single thread by hand would be to open the thread, click the thread tools on top, click "Show Printable Version" and then click "show 40 posts from this thread on one page".
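Scripted, that hand method is just a URL pattern per thread, so you could generate the page list like this. A sketch only: printthread.php and the pp/page parameters are how vBulletin usually does it, but I haven't verified them against the PUF, and the thread ID is made up:
Code:
# Sketch: build printable-view URLs for one thread, 40 posts per page.
# The endpoint, parameter names and thread ID are assumptions to verify
# against the real forum before use.
import math

BASE = 'https://www.propellerheads.se/forum/printthread.php'

def print_urls(thread_id: int, post_count: int, per_page: int = 40):
    pages = math.ceil(post_count / per_page)
    return [f'{BASE}?t={thread_id}&pp={per_page}&page={n}'
            for n in range(1, pages + 1)]

# A 230-post thread comes out as 6 pages:
for url in print_urls(thread_id=123456, post_count=230):
    print(url)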
V9 | i7 5930 | Motu 828 MK3 | Win 10
Bavanity wrote: What would be really cool is to just archive the meat and skip over any bad stuff in the topics; just the tips and tricks... [...]
I and a few others posted many entire threads dedicated to a theme. But even in those specific cases, I don't feel it's that important to document every written word.
I feel it's most important to save any useful tutorials, graphs, data (like dB to 127 conversion stuff), patches/combinators, etc.
Others may have different objectives, and that's a good thing IMO, but that's where I'm going to focus. I'm also hoping the Props will hear my plea to keep the forums online for a month or so longer; we'll see.
Selig Audio, LLC
I used SiteSucker and it can grab as deep as you want. But in my test, the downloaded site archive of the Advanced Users forum wanted a login to view the user forums, even though all the content was on my drive. I know there's a way to get past that limitation. But as far as how much content is there in the forum, check this out:
I think the only way to do this is to archive the entire Prop forum with SiteSucker or wget, etc. By the way, I saw a post on PUF asking for a phpBB expert here for this forum. I used to customize and manage several phpBBs, but I don't have time now. If you get stuck on mods or customizing, I can see if I have any answers.
Attachment: Screen_Shot_2015-01-16_at_7.46.47_PM.png (145.73 KiB)
@Wendulou: Did you get the whole thing already? Is it all done or do we need to get active here?
In regards to the phpBB expert search: I think that was Theo looking for his alternate forum...
There was a confusion about who had secured which domain name...
D.
I only did a test crawl. My biggest concern is how to get around the login requirement, which fails to work when the content is local. Also, a deep crawl with unlimited depth can get really crazy and may never finish successfully unless you limit how deep it goes and set limits on things like crawling external links, file types such as videos on YouTube (no need to download all that content), etc.
SiteSucker is a great tool for archiving an entire, deep website, but I'm concerned about going through the whole effort if one cannot log into the forum when it is located on a local hard drive rather than at Propellerhead. So if anyone else wants to do a deep crawl that is not broken afterwards by non-functional logins, go for it. Unless that problem can be resolved, it makes no sense to archive the entire forum.
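For reference, the limits I mean map onto crawler options roughly like this. A sketch driving GNU wget (which must be installed); the flags are real wget options, but the depth, rejects and cookie file are illustrative choices, not a recipe tested against this forum:
Code:
# Sketch: a depth- and type-limited mirror with GNU wget.
import subprocess

subprocess.run([
    'wget',
    '--mirror', '--level=5',          # don't crawl infinitely deep
    '--no-parent',                    # stay inside the forum path
    '--convert-links',                # rewrite links for local browsing
    '--adjust-extension',             # save pages with .html extensions
    '--reject', '*.mp4,*.avi,*.mov',  # skip video files
    '--domains', 'propellerheads.se', # never follow external links (YouTube etc.)
    '--load-cookies', 'cookies.txt',  # exported login cookies (hypothetical file)
    '--wait=1',                       # be gentle with the server
    'https://www.propellerheads.se/forum/',
], check=True)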