Mirror of https://github.com/by-jp/www.byjp.me.git
Synced 2025-08-09 05:36:07 +01:00

Commit c7c0dd9aa3 (parent 93e3da5495): Add tags & debug missing article
9 changed files with 164 additions and 15 deletions
@@ -25,6 +25,9 @@ references:
   dopamine-and-value-of-work:
     url: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4696912/
     name: Mesolimbic Dopamine Signals the Value of Work
+tags:
+- curious
+- biology
 ---
 
 I really enjoyed reading about dopamine, desire and pleasure here! It’s worth a read, even if it slightly veers into the “how to abuse this knowledge for personal gain with your new product” space.
@@ -11,6 +11,9 @@ references:
     summary: Solitude, creativity, Bergman, Grothendieck, and the pursuit of great
       ideas.
     author: Henrik Karlsson, Johanna Wiberg
+tags:
+- self-help
+- creativity
 ---
 
 
@@ -9,6 +9,8 @@ references:
     type: entry
     name: Google shattered human connection
     summary: Open Source Freelancer
+tags:
+- tech
 ---
 
 
@@ -11,6 +11,7 @@ references:
     summary: examines claims that the big five personality traits is more scientifically
       valid than the myers briggs personality indicator
     author: dynomight
+tags: []
 ---
 
 I fit the eNxP model by this article's assertions, and it feels representative enough to be useful. I like the idea of these tools to label & understand yourself _particularly_ because you can figure out where you _don't_ fit the model/where the label _isn't_ useful.
content/bookmarks/neologisms.md (new file, 72 lines)
@@ -0,0 +1,72 @@
---
title: Neologisms
date: "2024-03-03T08:57:01Z"
bookmarkOf: https://maggieappleton.com/neologisms
references:
  bookmark:
    url: https://maggieappleton.com/neologisms
    type: entry
    name: Neologisms
    summary: A collection of interesting words that have recently been coined
tags:
- words
---

I love creating or finding words for niche (or expanding) concepts and areas. These highlighted ones are particularly great!

### Highlights

> #### Neofeudalism
>
> The new Feudalist system we all rely upon for online security. No single person or company can defend themselves against hackers, attackers and trolls without aligning themselves with one of the monolithic fortresses (Apple, Facebook, Google, or Microsoft). We hide behind their enormous cybersecurity teams. We're required to live within their walls and tolerate whatever surveillance suits their business model. It's feudalism, with likes.

I feel like Cory Doctorow is good at giving catchy names to concepts we don’t entirely think about, but probably should. (“Enshitification” will live the lifetime of capitalism because of him!)

---

> #### Metarationality
>
> Having epistemological awareness of _how_ you know what you know, _how much_ you know about it, and how much you should _defer to experts_ instead of your own judgement.
>
> We should all aim for at least mediocre metarationality.

A great example of how it’s good to have a word, just so you’re solidly aware of the concept behind it.

---

> #### Milquetoast
>
> Vanilla, bland, flavourless Normie behaviour.
>
> Someone might be Milquetoast if the primary features of their personality centre around liking avocados, watching Game of Thrones, and "travel"

Is this a neologism‽ I thought this was a pre-information age expression!

---

> #### Theoretical Graffitiability
>
> The degree to which an academic theory can be captured in graffiti in public space. I coined this one after someone posted a photo of the [futures cone](https://sjef.nu/theory-of-change-and-the-futures-cone/) in spray paint.

---

> #### Yak Shaving
>
> Embarking on a sequence of nested tasks to accomplish a goal, where each step seems logical and necessary in the moment, but becomes less and less linked to the original goal.
>
> You set out to fix a broken image in your code, which leads to refactoring the image rendering function, which requires updating your npm packages, but first you need to plan this all out in Jira, and then install the latest version of Adobe Flash, and on and on in seemingly logical sequence until you find yourself in a zoo... shaving a yak.

I love the origins of this phrase; a Ren and Stimpy cartoon where Ren (?) has to shave a yak to get fluff for his pillow so he can get some sleep 😁

---

> #### Epistemic Peer
>
> People who have demonstrated clear cognitive overlap with you. You trust their thinking enough that if they feel strongly about a topic you haven't researched, you're willing to defer to their judgement. If they disagree with you, you take it seriously. Even if you don't change your mind, you consider their viewpoint valid.
>
> You earn epistemic peerhood by doing your research on topics you choose to write about, presenting compelling evidence, and making creative arguments that help reframe existing debates in more interesting ways.

I love this concept! It feels like we’re missing it a lot on the web (see [the web’s missing communication faculty](/posts/webs-missing-communication-faculty/)), I fully intend to seek out and recognise people who can be my _epistemic peers_.

(Though I think declaring who they are might be a security risk?)
content/bookmarks/okay-color-spaces.md (new file, 17 lines)
@@ -0,0 +1,17 @@
---
title: Okay, Color Spaces
date: "2024-03-03T08:57:00Z"
publishDate: "2024-02-20T00:00:00Z"
bookmarkOf: https://ericportis.com/posts/2024/okay-color-spaces/
references:
  bookmark:
    url: https://ericportis.com/posts/2024/okay-color-spaces/
    type: entry
    name: Okay, Color Spaces
    summary: What is a “color space?”
    author: Eric Portis
tags:
- color
---

An excellent explanation of colourspaces and their uses today.
@@ -12,6 +12,7 @@ references:
     the year bring? What hopes and hurts, what majesty and mayhem, what lessons
       and laments? January holds in the palm of its hands…
     author: Julie Zhuo
+tags: []
 ---
 
 I enjoyed this article’s passion for _being alive_, I think there’s a lot of similarity to a previous article’s [obviousness](/tags/obviousness), and putting ourselves in emotionally challenging positions sometimes.
@@ -10,6 +10,10 @@ references:
     name: Breaking the Tyranny of Obviousness
     summary: We are stuck in a hell of frictionlessness.
     author: P.E. Moskowitz
+tags:
+- curious
+- grief
+- Miriscient
 ---
 
 
@@ -7,9 +7,11 @@ import (
 	"fmt"
 	"io"
 	"net/http"
+	"net/url"
 	"os"
 	"path"
 	"regexp"
+	"slices"
 	"strings"
 	"time"
 
@@ -19,6 +21,12 @@ import (
 //go:embed query.gql
 var gql string
 
+var ignoreLabels = []string{
+	"opinion-agree",
+	"opinion-disagree",
+	"interesting",
+}
+
 func main() {
 	apiKey, ok := os.LookupEnv("OMNIVORE_API_KEY")
 	if !ok || len(apiKey) == 0 {
@@ -47,7 +55,6 @@ func main() {
 var hashtags = regexp.MustCompile(`#\w+`)
 
 func outputArticle(article Article, outputDir string) error {
-
 	slug := kebab(article.Title)
 	hugoPost, err := os.Create(path.Join(outputDir, fmt.Sprintf("%s.md", slug)))
 	if err != nil {
@@ -55,10 +62,9 @@ func outputArticle(article Article, outputDir string) error {
 	}
 
 	fm := FrontMatter{
-		Title:       article.Title,
-		Date:        article.BookmarkDate.Format(time.RFC3339),
-		PublishDate: article.PublishDate.Format(time.RFC3339),
-		BookmarkOf:  article.OriginalURL,
+		Title:      article.Title,
+		Date:       article.BookmarkDate.Format(time.RFC3339),
+		BookmarkOf: article.OriginalURL,
 		References: map[string]Ref{
 			"bookmark": {
 				URL: article.OriginalURL,
@@ -68,6 +74,11 @@ func outputArticle(article Article, outputDir string) error {
 				Author: article.OriginalAuthor,
 			},
 		},
+		Tags: article.Tags,
 	}
 
+	if !article.PublishDate.IsZero() {
+		fm.PublishDate = article.PublishDate.Format(time.RFC3339)
+	}
+
 	fmt.Fprintln(hugoPost, "---")
@@ -77,16 +88,20 @@ func outputArticle(article Article, outputDir string) error {
 	}
 
 	fmt.Fprint(hugoPost, "---\n\n")
-	fmt.Fprintln(hugoPost, linkHashtags(article.Annonation))
+	fmt.Fprintln(hugoPost, linkHashtags(article.Annonation, fm.Tags))
 	fmt.Fprintln(hugoPost)
-	fmt.Fprint(hugoPost, "### Highlights\n\n")
+
+	if len(article.Highlights) > 0 {
+		fmt.Fprint(hugoPost, "### Highlights\n\n")
+	}
 
 	for i, highlight := range article.Highlights {
 		noTrailingNewLine := strings.TrimRight(highlight.Quote, "\n ")
 		quote := "> " + strings.ReplaceAll(noTrailingNewLine, "\n", "\n> ")
 		fmt.Fprint(hugoPost, quote+"\n\n")
 
 		if highlight.Comment != "" {
-			fmt.Fprint(hugoPost, linkHashtags(highlight.Comment)+"\n\n")
+			fmt.Fprint(hugoPost, linkHashtags(highlight.Comment, fm.Tags)+"\n\n")
 		}
 
 		if i < len(article.Highlights)-1 {
@@ -97,9 +112,10 @@ func outputArticle(article Article, outputDir string) error {
 	return nil
 }
 
-func linkHashtags(text string) string {
+func linkHashtags(text string, tags []string) string {
 	return hashtags.ReplaceAllStringFunc(text, func(hashtag string) string {
+		tags = append(tags, hashtag[1:])
-		return fmt.Sprintf("[%s](/tags/%s)", hashtag[1:], hashtag[1:])
+		return fmt.Sprintf("[%s](/tags/%s)", hashtag[1:], strings.ToLower(hashtag[1:]))
 	})
 }
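The updated `linkHashtags` now lowercases the link target and collects each hashtag into the tags slice. A standalone sketch of that behaviour (it returns the collected tags explicitly, since an `append` to the `tags` parameter inside the closure is not otherwise visible to the caller; the input text here is hypothetical):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var hashtags = regexp.MustCompile(`#\w+`)

// linkHashtags rewrites #tags as Hugo tag links with lowercased
// targets, and returns the tag names it encountered.
func linkHashtags(text string, tags []string) (string, []string) {
	linked := hashtags.ReplaceAllStringFunc(text, func(hashtag string) string {
		tags = append(tags, hashtag[1:]) // collect the name without '#'
		return fmt.Sprintf("[%s](/tags/%s)", hashtag[1:], strings.ToLower(hashtag[1:]))
	})
	return linked, tags
}

func main() {
	linked, tags := linkHashtags("Loved this #Tech read on #Words", nil)
	fmt.Println(linked) // Loved this [Tech](/tags/tech) read on [Words](/tags/words)
	fmt.Println(tags)   // [Tech Words]
}
```

Note the link text keeps the original casing while only the `/tags/...` path is lowercased, matching how Hugo tag URLs are usually case-insensitive slugs.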
@@ -127,6 +143,7 @@ type Article struct {
 	OriginalAuthor string
 	Annonation     string
 	Highlights     []ArticleHighlight
+	Tags           []string
 }
 
 type ArticleHighlight struct {
@@ -137,9 +154,10 @@ type ArticleHighlight struct {
 type FrontMatter struct {
 	Title       string
 	Date        string
-	PublishDate string `yaml:"publishDate"`
+	PublishDate string `yaml:"publishDate,omitempty"`
 	BookmarkOf  string `yaml:"bookmarkOf"`
 	References  map[string]Ref
+	Tags        []string
 }
 
 type Ref struct {
@@ -175,7 +193,7 @@ type SearchResult struct {
 	Highlights []Highlight
 	Labels     []struct {
 		Name string `json:"name"`
-	}
+	} `json:"labels"`
 }
 
 type Highlight struct {
@@ -256,6 +274,8 @@ func parseResponse(body []byte) ([]Article, string, error) {
 	for _, edge := range searchResults.Data.Search.Edges {
 		sr := edge.Node
 
+		articleURL := stripMarketing(sr.OriginalArticleURL)
+
 		var highlights []ArticleHighlight
 		var annotation string
 		for _, highlight := range sr.Highlights {
@@ -270,6 +290,7 @@ func parseResponse(body []byte) ([]Article, string, error) {
 		}
 
 		if len(annotation) == 0 {
+			fmt.Fprintf(os.Stderr, "No annotation for %s\n", articleURL)
 			continue
 		}
 
@@ -280,8 +301,7 @@ func parseResponse(body []byte) ([]Article, string, error) {
 		}
 		published, err := time.Parse(time.RFC3339, sr.PublishedAt)
 		if err != nil {
-			fmt.Fprintf(os.Stderr, "Failed to parse PublishedAt date: %s\n", sr.ID)
-			continue
+			fmt.Fprintf(os.Stderr, "Failed to parse PublishedAt date (for %s): %s\n", articleURL, sr.ID)
 		}
 
 		abbreviatedOriginalTitle := sr.Title
@@ -304,7 +324,7 @@ func parseResponse(body []byte) ([]Article, string, error) {
 			ID:              sr.ID,
 			Title:           title,
 			OriginalTitle:   abbreviatedOriginalTitle,
-			OriginalURL:     sr.OriginalArticleURL,
+			OriginalURL:     articleURL,
 			OriginalAuthor:  sr.Author,
 			OriginalSummary: sr.Description,
 			BookmarkDate:    bookmarked,
@@ -313,6 +333,13 @@ func parseResponse(body []byte) ([]Article, string, error) {
 			Annonation: annotation,
 		}
 
+		for _, label := range sr.Labels {
+			if slices.Contains(ignoreLabels, label.Name) {
+				continue
+			}
+			article.Tags = append(article.Tags, label.Name)
+		}
+
 		articles = append(articles, article)
 	}
 
@@ -323,3 +350,22 @@ func parseResponse(body []byte) ([]Article, string, error) {
 
 	return articles, cursor, nil
 }
+
+func stripMarketing(rawURL string) string {
+	u, err := url.Parse(rawURL)
+	if err != nil {
+		fmt.Fprintf(os.Stderr, "Failed to parse URL: %s\n", rawURL)
+		return rawURL
+	}
+
+	q := u.Query()
+	q.Del("amp")
+	q.Del("utm_source")
+	q.Del("utm_medium")
+	q.Del("utm_campaign")
+	q.Del("utm_content")
+	q.Del("utm_term")
+	u.RawQuery = q.Encode()
+
+	return u.String()
+}
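The new `stripMarketing` helper can be tried standalone. A sketch with the same parameter list, minus the stderr logging (the example URL is hypothetical; note that `url.Values.Encode` re-serialises the remaining query parameters in sorted key order, so a cleaned URL may have its parameters reordered):

```go
package main

import (
	"fmt"
	"net/url"
)

// stripMarketing drops common tracking query parameters from a URL
// and leaves everything else alone.
func stripMarketing(rawURL string) string {
	u, err := url.Parse(rawURL)
	if err != nil {
		return rawURL // unparseable URLs pass through untouched
	}
	q := u.Query()
	for _, p := range []string{"amp", "utm_source", "utm_medium", "utm_campaign", "utm_content", "utm_term"} {
		q.Del(p)
	}
	u.RawQuery = q.Encode()
	return u.String()
}

func main() {
	fmt.Println(stripMarketing("https://example.com/post?utm_source=rss&utm_medium=feed&id=42"))
	// https://example.com/post?id=42
}
```

Because the cleaned URL is also used as the article's `OriginalURL`, the same bookmark fetched with and without tracking parameters now deduplicates to one canonical address.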