@barefootstache@qoto.org avatar

barefootstache

@barefootstache@qoto.org

I am a strong proponent of leaving this planet behind better than when I arrived on it. Thus, to get the most bang for a lifetime, my key focus is #longevity, which I attempt to achieve with #nutrition, specifically #plantbased.

Longevity is good and all, as long as you are not frail and weak. Ideally one would die young at an old age. Thus I incorporate tactics from #biohacking and #primalfitness. Additionally I am an advocate of #wildcrafting, which is a superset of #herbalism.

I have studied many fields of science, like maths and statistics, though the constant was always computer science.

Currently working as a fullstack web developer, though I prefer to call myself a #SoftwareCrafter.

The goal of my side projects is to practice #GreenDevelopement, meaning to create mainly static websites: the way the internet was intended to be.

On the artistic side, I dub all content under the Creative Commons license, thereby ideally only using tools and resources that are #FLOSS #OpenSource. #nobot


barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (300/300)

Another checkpoint has been reached!

In the last 50 entries, most were pre-written and most were part of a thread. The flexibility granted at 200 was definitely a game changer and reduced stress quite a bit.

One thing that I have noticed: although the entry already existed, the act of posting was the bottleneck.

Another thing that I changed is the way the entries are bundled. Initially I wrote each entry separately and then linked them after the fact.

Now instead I write them all in the same file as long as they are part of the same thread. This eases the publishing, the writing, and the linking. Additionally, with this method, I can add references at the end of the file and be one step closer if I ever choose to publish long form.

Well, off to another 65 days to complete a typical year. Technically this year would have 366 days.

jpeelle, to random
@jpeelle@neuromatch.social avatar

No one in my lab has ever heard of "version control" so I'm making it my mission to educate them. Does anyone have suggested resources for beginners?

barefootstache,
@barefootstache@qoto.org avatar

@freemo

Sounds like the same concepts as in . To reflect on the git environment: HackTheGit

@jpeelle

barefootstache, to random
@barefootstache@qoto.org avatar

Looks like is now stable after three weeks of upgrades. The have been somewhat out of whack for the month of February, thus, to put all back in sync, I will be posting twice a day for the next few weeks.

barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (168/200)

Have you just overexposed your hands to the cold and don't have hand warmers? Then there are two locations that can help: first, stick your hands under your armpits; second, by your groin. Additionally, to retain more warmth, make yourself as small as possible and try to find shelter from the elements, e.g. hide behind a wall for wind protection.

These are two areas that the body tries to keep alive for biological reasons. Thus this trick works great in a pinch.

barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (153/200)

There are two main ways to #scrape a #website, either actively or passively.

Active scraping is the process of using a trigger to actively scrape the already loaded webpage.

Passive scraping is the process of having the tool navigate to the webpage and scrape it.

The main difference is how one is getting to the loaded #webpage.

#WebsiteScraping

barefootstache,
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (154/200)

To passively scrape a webpage one uses automation tools, ideally headless browsers like #Selenium or #Puppeteer. Of course one can use any tool that is typically used for #e2e testing in the #browser.

The biggest obstacle for passively scraping is dealing with either #captcha or #cloudflare.

There are options to use captcha farms for a small monetary fee, and Cloudflare can be overcome by IP hopping.

In general, passive scraping only works on websites that are poorly configured.
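As a sketch of the passive flow described above, assuming a Puppeteer-like API (the `newPage`/`goto`/`evaluate` shape and all names here are illustrative, not a specific library or site):

```javascript
// Minimal dependency-injected sketch of passive scraping: the tool itself
// navigates to the page and then extracts the data. `browser` stands in
// for a headless-browser handle such as one Puppeteer would provide.
async function passiveScrape(browser, url, selector) {
  const page = await browser.newPage(); // the tool opens its own tab
  await page.goto(url);                 // the tool does the navigating
  // collect the text of every element matching the selector
  const data = await page.evaluate(
    (sel) => Array.from(document.querySelectorAll(sel), (el) => el.textContent),
    selector
  );
  await page.close();
  return data;
}
```

Injecting the browser object keeps the sketch testable without a real headless browser installed.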

barefootstache,
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (155/200)

To actively scrape a webpage one either employs an extension or uses the console.

Here the difference is where the code lives and who maintains it. The benefit of using the console is that one stays browser agnostic and can keep a level of anonymity, whereas an extension could be used as a fingerprinting marker.

E.g. if using the browser, one should not diverge from the installed extensions, since one would be more easily identified compared to the herd. Using the console would be preferred in this case.

On the flip side, using an extension removes the need to copy and paste the code into the console every time.

barefootstache,
@barefootstache@qoto.org avatar

(156/200)

The question remains: why should one learn how to scrape? The obvious answer is to get data from the webpage. Further reasons are to learn how to evaluate a website, and then to build extensions that present the page to one's liking.

Although web scraping might have a negative connotation, how different is it from skimming literature and picking out the specific patterns? And with AI/LLMs on the rise, one can now evaluate texts even quicker.

barefootstache,
@barefootstache@qoto.org avatar

(157/200)

When actively scraping, the main starting function is

document.querySelectorAll()

This returns a NodeList, over which one typically iterates with a for-loop.

On each item, either querySelector or querySelectorAll is applied recursively until all the specific data instances are extracted.

This data is then saved in various formats depending on future processing, either as an object in an array or as a string, which is then saved to localStorage, sessionStorage, or IndexedDB, or downloaded via a temporary link.
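The loop described above could look roughly like this in the console; the `.item` and `.title` selectors are placeholders, not from any real page:

```javascript
// Active-scraping sketch, assuming it runs on an already loaded page.
// querySelectorAll yields a NodeList; a for-of loop visits each item,
// and querySelector is applied again on each item to pull out fields.
function scrapeItems(root) {
  const rows = [];
  for (const item of root.querySelectorAll('.item')) {
    rows.push({
      title: item.querySelector('.title')?.textContent.trim() ?? '',
      link: item.querySelector('a')?.getAttribute('href') ?? '',
    });
  }
  return rows; // array of objects, ready for JSON.stringify / localStorage
}
```

Calling it as `scrapeItems(document)` in the console would collect one object per matching element.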

barefootstache,
@barefootstache@qoto.org avatar

(158/200)

One option for future processing is opening a new tab as an HTML page.

This has the benefit that the header details stay constant, meaning that calling media like images isn't being blocked. Further, one can highlight the details that one deems important, compared to the original creator.

One builds the HTML page as a string, just as one typically would. The only difference is that the file extension is *.js instead of *.html.

barefootstache,
@barefootstache@qoto.org avatar

(159/200)

This function builds a website from scratch with the body parameter being the only necessary input.

    /**
     * Opens a new window with a 'title'.
     *
     * @param body - the body of the HTML page
     * @param style - the style of the HTML page
     * @param title - the title of the HTML page
     * @param script - the javascript of the HTML page
     */
    static openNewWindow(body: string, style = '', title = 'new display', script = ''): true {
      const mywindow = window.open('', '_blank') as Window;
      mywindow.document.write(`<html><head><title>${title}</title>`);
      mywindow.document.write(`<style>${style}</style>`);
      mywindow.document.write('</head><body>');
      mywindow.document.write(body);
      mywindow.document.write('<script>');
      mywindow.document.write(script);
      mywindow.document.write('</script>');
      mywindow.document.write('</body></html>');
      mywindow.document.close(); // necessary for IE >= 10
      mywindow.focus(); // necessary for IE >= 10
      return true;
    }

This can be used as a way to display the scraped data.

barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (147/200)

As an active participant in the @weeklyOSM project, which is celebrating its 700th weekly news update, one gets to admire a long-lasting community project.

This would put the first edition at almost 14 years ago, which is only a couple of years after the project started.

There are a lot of people working behind the scenes: gathering the news stories, writing up small summaries, translating these into the various languages, proofreading, and finally publishing at the end of the week.

barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (136/200)

TIL in psychology that we retain all information for up to 0.5 s before filtering away the unnecessary data.

This was shown through an experiment where subjects were shown 4 characters for 50 ms and asked to recall them. Most of the information was lost during active recall.

barefootstache,
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (137/200)

When looking at short term memory, a portion of it is the active working memory.

One way to look at the active working memory is like matrix multiplication.

Let’s say one gets a 5-digit number. It is fairly simple to return that number in the same order as given, which would equate to the identity matrix.

If one had to invert the order of the number, then the matrix would be the identity matrix flipped onto its anti-diagonal, i.e. the exchange matrix, whose determinant is ±1 depending on the dimension.

Now imagine one had to do a more complex calculation, like ordering the months of the year alphabetically. Creating a matrix that permutes the months vector is initially not that simple, though once instantiated, it becomes quite simple to repeat.

Depending on how many row alterations are needed starting from the identity matrix, this could be a way to quantify the complexity of the task.
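The digit-recall analogy can be made concrete with actual permutation matrices (a sketch; the helper names are illustrative):

```javascript
// Build a permutation matrix: order[i] is the index of the input element
// that ends up at position i of the output.
function permutationMatrix(order) {
  return order.map((src) => order.map((_, j) => (j === src ? 1 : 0)));
}

// Plain matrix-vector multiplication.
function apply(matrix, vector) {
  return matrix.map((row) => row.reduce((sum, v, j) => sum + v * vector[j], 0));
}

const digits = [4, 1, 7, 2, 9];
const identity = permutationMatrix([0, 1, 2, 3, 4]); // repeat as given
const reversal = permutationMatrix([4, 3, 2, 1, 0]); // inverted order
```

`apply(identity, digits)` returns the digits unchanged, while `apply(reversal, digits)` returns them reversed, mirroring the two recall tasks.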

barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (135/200)

Sometimes one questions how the price of an item was decided on. Take for example pickles: there are multiple options at the store, although they all have basically the same ingredients.

One could claim one is paying extra for the brand. That would be fine if the brand were actually good, though a lot of the time I have found that the no-name or store brand is better than some specific big-name brand.

barefootstache,
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (138/200)

Over the past two days, I was living off of 5-minute quick-make foods, meaning one just needs to add hot water, stir, and wait.

Each meal/cup had a dry weight of about 60 g and ca. 250 g wet weight. It would take about 3 cups to be satiated.

Considering that the price per cup ranged from 0.80€ to 3.50€, plus the convenience factor, it can be quite an expensive luxury, depending on what price tag one puts on one's own working time.

If one prices one's working time at 0.00€, then at the top cup price one is already looking at 10.50€ per meal. Now considering one would buy the raw ingredients, which on average cost only a fraction of the price, one quickly realizes the luxury of such quick convenience foods.

barefootstache,
@barefootstache@qoto.org avatar

(139/200)

Let p be the price of a cup and w the hourly work cost. Then the cost C of preparation per cup is

C = p + w * t

for preparation time t in hours.

Taking the three-cup example, the cost of a satiating meal M (with t = 5 min ≈ 0.083 h) is

M = 3 * C = 3 * (p + w * 0.083)

Let’s take a look at various price points p = {0.8, 1.5, 3.5} and various hourly wages w = {20, 50, 100}. This gives us the table WP (meal cost M in €):

w\p   0.8    1.5    3.5
20    7.4    9.5    15.5
50    14.9   17     23
100   27.4   29.5   35.5
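The table WP can be reproduced with a one-liner for M (the helper name is illustrative):

```javascript
// Meal of three cups: M = 3 * (p + w * t), with price p per cup,
// hourly wage w, and prep time t in hours (5 minutes by default).
const mealCost = (p, w, t = 5 / 60) => 3 * (p + w * t);
```

For example, `mealCost(1.5, 20)` gives 9.5€, matching the middle entry of the first row.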

Now consider that whole food costs a fraction p_f of the price of convenience food, though in return needs more prep time by a factor t_f.

This gives us the new equation of

C_f = p_f * p + w * t * t_f

and

M_f = 3 * C_f = 3 * (p_f * p + w * t * t_f)

We will look at p_f = {0.5, 0.8} and t_f = {5, 10}, and only at the last two columns of the previous table WP.

First we multiply out p_f * p and t * t_f, so that we can reuse the shape of equation M.

Let table P (= p_f * p) be

p_f\p   1.5    3.5
0.5     0.75   1.75
0.8     1.2    2.8

and table T (= t * t_f, with t = 0.083) be

t_f\t   0.083
5       0.42
10      0.83

This gives us the equation

M_f = 3 * C_f = 3 * (P + w * T)

and the table WTF (= w * T)

w\t*t_f   0.42   0.83
20        8.4    16.6
50        21     41.5
100       42     83

With this we can finally calculate the table WPF (= M_f = 3 * (P + w * T)):

w*T\P   0.75     1.2      1.75     2.8
8.4     27.45    28.8     30.45    33.6
16.6    52.05    53.4     55.05    58.2
21      65.25    66.6     68.25    71.4
42      128.25   129.6    131.25   134.4
83      251.25   252.6    254.25   257.4

Since 41.5 and 42 are close, we omitted the former.

The table WPF shows that, once one factors in one's own wage for food prep, a meal becomes quite expensive.

So next time one complains how expensive a meal is at the restaurant, one can compare how expensive it would be if one prepped it oneself.
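For checking the numbers, M_f can be written the same way (a sketch; exact values differ from the rounded tables by a few tenths, since the tables use t * t_f rounded to two decimals):

```javascript
// Whole-food variant M_f = 3 * (p_f * p + w * t * t_f): the cup price
// shrinks by factor pf while the prep time grows by factor tf.
const wholeFoodMeal = (p, w, pf, tf, t = 5 / 60) => 3 * (pf * p + w * t * tf);
```

E.g. `wholeFoodMeal(1.5, 20, 0.5, 5)` gives about 27.25€, the top-left region of table WPF.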

ceoln, to Minecraft
@ceoln@qoto.org avatar

Bored with everything (I can afford heh heh) this weekend, so I started a new world. My start point is at the edge of a jungle, which is a first. Also there was a patch of wild watermelons right nearby, so I've been living entirely on watermelon slices. :)

barefootstache,
@barefootstache@qoto.org avatar

@ceoln imagine doing that IRL

freemo, to random
@freemo@qoto.org avatar

is going to block @childlove.su as I have witnessed open approval of pedophilia that has not been acted on.

@khird @barefootstache

barefootstache, to random
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (105/200)

The tool that I use the most when doing detailed natural and landuse mapping in #JOSM is UtilsPlugin2, especially its Split Object tool.

The reason being that one saves clicks. Let’s say one has 4 farmlands in a grid layout. Using the basic way of drawing each farmland separately, one needs 16 clicks. Using the Split Object tool, one needs 8-12 clicks, depending on whether one uses other tools as well.

In this simple example there is already a click-count reduction. Now imagine that instead of an area with 4 nodes one has one with 200 nodes and wants to split it in half. In the worst case one clicks through all the nodes and adds 2 * nodeCountSplitLine on top. A straight split line has at minimum 2 nodes, making 204 clicks in total. If instead of a 2-node line one had a 50-node line, one would have to click 300 times, whereas with the Split Object tool it would stay at 250.

#OpenStreetMap
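The click counts above reduce to two small formulas (a sketch, with hypothetical names: n outline nodes, s split-line nodes):

```javascript
// Basic redraw: every outline node once, the shared split line twice.
const clicksBasic = (n, s) => n + 2 * s;
// Split Object tool: the shared split line is only clicked once.
const clicksSplitObject = (n, s) => n + s;
```

This reproduces the 204 vs. 300 vs. 250 counts from the example.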

lowqualityfacts, to random
@lowqualityfacts@mstdn.social avatar

I'm surprised that so many people don't know this.
https://patreon.com/lowqualityfacts

barefootstache,
@barefootstache@qoto.org avatar

@freemo

Can't tell if the flavoring really matters when the hardship lies in the texture

@lowqualityfacts

barefootstache, to hiking
@barefootstache@qoto.org avatar

#DailyBloggingChallenge (80/100)

Choosing the right tool for the task depends on the future requirements.

Take for example a puddle-filled path. Ideally, one would either have waterproof boots or an extra pair of shoes to change into.

If neither is applicable, then one can try to construct an optimal route with various hops to mitigate as much water contact as possible. Or one could just as well take off the shoes plus socks and do the trail barefoot.

Doing a trail barefoot is no simple feat, since one needs to regularly train the foot muscles and harden the skin of the sole. Otherwise one will tire out too quickly and/or not keep the pace of the other hikers. Additionally, one will need to pay stricter attention to where one steps, not only to avoid additional pain from landing on sharp rocks, but also to avoid unnecessary slipping and twisting of the foot. Thus, if possible, one should avoid stepping in mud: first because unknown objects can hide underneath it, and secondly because it can create a slick surface, increasing the potential of falling and/or injury.

tiffanycli, to random
@tiffanycli@mastodon.social avatar

Is there any Mastodon app that lets you customize how many words you see per post? My attention span is asking for faster scrolling 🤐

barefootstache,
@barefootstache@qoto.org avatar

@zleap @freemo @tiffanycli @QOTO

Various apps have various character post limits, so if one would like to write a long post on qoto, one will have to write it in a 3rd-party app and copy-paste it into the Mastodon app of one's choice.

freemo, to random
@freemo@qoto.org avatar

LOL, someone on Facebook just told me my "language is not welcome" because I used fuck in a post (not in any way that was directed at a person)...

Well lady, I see this is your first day on the internet.. please let me introduce you to the back where the real shit goes down....

barefootstache,
@barefootstache@qoto.org avatar

@freemo how far back are we talking about?

freemo, to random
@freemo@qoto.org avatar

Conspiracy theorists: Man everyone is a sheep, they just believe everything told to them.

All conspiracy theorists in unison: No one has ever isolated a virus.

barefootstache,
@barefootstache@qoto.org avatar

@freemo the beauty of selective listening
