I've been excited to write JavaScript code in Deno, a runtime that promised a well-thought-out and stable standard library. I looked forward to updating my code to match their 1.0 release, announced with some fanfare. This turned out to be more work than I expected.
My application crawls the wiki federation, which can take half a day, but then refreshes in 15 minutes by incrementally updating the previous results.
The startup logic asks two questions: do I have data from previous runs, and what sites were included then? I sought coding help from three friends. We got the code working again after this and similar changes.
async function preload(root: site) {
  visit = 0
  skip = 0
- let files = await Deno.readdir('data')
- if (files.length > 0) {
-   scrape(files.map(i => i.name))
- } else {
+ let files = Deno.readDir('data')
+ let some = false
+ for await (let each of files) {
+   some = true
+   scrape([each.name])
+ }
+ if (!some) {
    scrape([root])
  }
}
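For readers who find the interleaved diff hard to follow, here is the updated function in one piece. It is a minimal sketch, with the site type written as string for the sake of a self-contained example, and with the assumption that visit, skip, and scrape are defined elsewhere in the crawler.

// Consolidated sketch of the updated startup logic. The `visit` and
// `skip` counters and the `scrape()` function live elsewhere in the
// crawler and are assumed here, not shown.
async function preload(root: string) {
  visit = 0
  skip = 0
  // Deno 1.0's readDir yields directory entries one at a time as an
  // async iterable, so there is no array length to test up front.
  let some = false
  for await (let each of Deno.readDir('data')) {
    some = true
    scrape([each.name])   // resume from each site saved by a previous run
  }
  if (!some) {
    scrape([root])        // no saved data yet, so start the crawl from the root site
  }
}

The flag replaces the old files.length > 0 check: the async iterator only reveals whether the directory was empty after the loop finishes, so the fallback to the root site has to come afterwards.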