Hi Lemmy; I’ve made a program that will download all the posts and images for a Lemmy community. It downloads them into an HTML file. This way users don’t have to worry about losing all their posts, because they can download the whole community themselves.
I hope someone finds this useful. Someone asked me for it so I provided it.
Thanks for this. It’s Thief btw ;-)
*Thief btw (thanks for making it!)
I made some improvements.
Just refactored it a good bit. Should be easier to read through the logic now. Error handling is better too.
nice. i’m a bit busy irl with family and work, will have a go at implementing comment backup once i get back.
That would be a lot of CSS so I have been avoiding it. Would be awesome if you get it to work though.
lol, you’re right, i have to build the comment tree myself too. looks like a lot of work, i think i’ll pass too
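for anyone who does want to take a crack at it later, rebuilding the tree from the flat comment list would look something like this hypothetical sketch (just an illustration, not code from the repo; it assumes each comment carries an `id` and an optional `parent_id`):

```rust
// Hypothetical types for illustration; the real comment fields come from the Lemmy API.
#[derive(Debug, Clone)]
struct Comment {
    id: i32,
    parent_id: Option<i32>, // None means a top-level comment
    content: String,
}

#[derive(Debug)]
struct CommentNode {
    comment: Comment,
    children: Vec<CommentNode>,
}

// Recursively collect the children of `parent` out of the flat list.
// Call with `build_subtree(None, &comments)` to get the top-level tree.
fn build_subtree(parent: Option<i32>, flat: &[Comment]) -> Vec<CommentNode> {
    flat.iter()
        .filter(|c| c.parent_id == parent)
        .map(|c| CommentNode {
            comment: c.clone(),
            children: build_subtree(Some(c.id), flat),
        })
        .collect()
}
```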
imo comments aren’t that important. Maybe for tech subs. Feel like lemmy needs moderation tools the most. I might look at one of those to build out. Playing with the dev environment today.
I put paging back so large communities won’t use up too much memory. I made limit optional and it defaults to 10, so it will download 10 at a time. I like how you changed it to download all images. It’s all merged. Gonna try to add a basic license tonight too.
i don’t see how that helps since the user has no control over the paging. you’re making a lot of additional requests to get the same data
You don’t want to get 1,000 posts in one request. I’m not even sure Lemmy would allow that. I’m not sure what you mean by no control; it’s downloading an entire community.
And yes, you want to make additional requests so it doesn’t time out.
actually lemmy allows getting any number of posts with one request.
I still don’t think it’s a great idea to get an entire large community at once. I’ll play with trying an entire large community like memes later when I get a minute. Either way, the way I have it you can still specify a really large limit. Pass in 700 and it will attempt 700 at once. Pass in 50 and it will do it in chunks of 50. In either case it will get the entire community and not stop at the number provided.
edit: Managed to get all 601 posts from /c/memes in a few seconds, which is cool. I’m just gonna make the default page size 100 instead of 10.
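For anyone curious, the paging loop is roughly this shape (a simplified sketch, not the exact code from the repo; it assumes the v3-style `GET /api/v3/post/list` endpoint with `community_name`, `page`, and `limit` query parameters, and uses reqwest’s blocking client plus serde_json):

```rust
use serde_json::Value;

/// Fetch every post in a community, `limit` posts per request.
/// Hypothetical helper for illustration only.
fn fetch_all_posts(
    instance: &str,
    community: &str,
    limit: u32,
) -> Result<Vec<Value>, Box<dyn std::error::Error>> {
    let mut all = Vec::new();
    let mut page = 1;
    loop {
        let url = format!(
            "https://{}/api/v3/post/list?community_name={}&page={}&limit={}",
            instance, community, page, limit
        );
        let body: Value = reqwest::blocking::get(url.as_str())?.json()?;
        let posts = body["posts"].as_array().cloned().unwrap_or_default();
        if posts.is_empty() {
            break; // past the last page
        }
        // A short page means we just got the tail end of the community.
        let last_page = (posts.len() as u32) < limit;
        all.extend(posts);
        if last_page {
            break;
        }
        page += 1;
    }
    Ok(all)
}
```

Passing a big limit just makes the first page cover everything, so both behaviours fall out of the same loop.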
looks good thanks. I’ll merge this!
Downloading a user should be a helpful feature too.
Unrelated, but someone should also try a Lemmy comment archiver like one of those services for reddit.
Seems like a very useful project. You can add a link to it in the Lemmy docs if you want.
Btw instead of copy-pasting the structs from Lemmy into your project, you could pull them directly from crates.io. The disadvantage is that it will force you to pull in a lot of dependencies that have to be compiled, but that could probably be avoided in the future by adding feature flags to the Lemmy crates.
Good to know
Originally I just took the JSON output and ran it through a Rust struct generator: https://transform.tools/json-to-rust-serde
Then I looked at the Lemmy code to get the time format and check which values needed to be optional.
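The generated structs come out looking roughly like this (illustrative only; the field set here is hypothetical and trimmed down, the real ones mirror whatever the API actually returns):

```rust
use serde::Deserialize;

// Trimmed-down illustration of a generated struct; not the project's actual types.
#[derive(Debug, Deserialize)]
struct Post {
    id: i32,
    name: String,          // the post title
    url: Option<String>,   // only link posts have this, so it must be Option
    body: Option<String>,  // only text posts have this
    published: String,     // timestamp string, parsed with the format Lemmy uses
}
```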
looks interesting, please add a license
Which one do you want in it?
don’t know, i would love to host it somewhere else if i make some changes, so anything that permits that would be nice
so can we do this with our profiles and therefore download all our posts?
It currently downloads an entire community and not a single user. I guess we could add that if that’s more interesting to you.
could be useful. I back my stuff up prior to posting, but if my computer and backups crashed, I guess this would serve as an additional backup
it just downloads the pics and titles right tho, not text within text posts?
Very cool, I’ll be sure to give this a try when I get a chance.
Thanks for creating this!