What's this all about?
I've started this project in an effort to consider more deeply the things I learn throughout my days spent as a knowledge worker. Along the way, I hope to capture some sublime vibes that come with finding novel solutions to tough problems.
This is foremost a celebration of discovery in a casual and granular format. The most recent entries are at the top of the list, and each entry is separated by a yellow marker. It's like a blog-within-a-blog!
If you have any questions or would like to share your own discoveries, please reach out on Twitter. Thanks for stopping by!
Using conditional includes within .gitconfig files is possible! Out of the box, Git v2.13+ allows you to specify directives like:
[includeIf "gitdir:~/work/"]
path = "~/.gitconfig-work"
This will include the referenced config file only when the Git repo you're working on is in a subdirectory of ~/work/. This is essentially direnv but with fewer steps! Love it.
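The included file can contain anything you'd normally put in a Git config; a separate work identity is the classic use case. A hypothetical ~/.gitconfig-work might look like:

# ~/.gitconfig-work (hypothetical contents)
[user]
    name = Jane Doe
    email = jane.doe@example.com
[core]
    sshCommand = ssh -i ~/.ssh/id_work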
Nice dark mode theme for qBittorrent. For, uh, when you're downloading your favorite open source libraries of course!
https://github.com/jagannatharjun/qbt-theme/releases/
Use quicktype.io to turn JSON data into auto-generated TypeScript types. It handles diverse input/output combos and is free & open source. Pretty neat!
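For example, feeding it a small payload yields roughly the following (the JSON and type names below are made up, and the exact output depends on your settings):

// Input: user.json
// { "id": 7, "name": "Ada", "tags": ["admin", "ops"], "lastLogin": null }

// Roughly the kind of TypeScript quicktype emits:
export interface User {
  id: number;
  name: string;
  tags: string[];
  lastLogin: null;
}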
Quick Rust installation on Windows 10/11:
winget install Rustlang.Rust.MSVC
wefwef is a streamlined and no-nonsense PWA for Lemmy users. Spending time on lemmy.world or beehaw.org these days? Give it a shot, it does a great job.
elk.zone has won my heart over. It's an absolutely wonderful Mastodon client and an inspiring open-source Nuxt application showcasing some real expertise in Vue.js application development.
Asahi Linux aims to bring a polished Linux® experience to Apple Silicon Macs.
Payload is an interesting Node & React based headless CMS solution, and is one of the few in the space that is open source. Whereas many equivalent products are monetized aggressively, Payload rolled out a very welcome MIT licensing model in May of 2022.
Take a look at the documentation, dust off the ol' Digital Ocean droplets, and give it a spin! I know I'll be trying it out soon.
Let's talk about "velocity", crunch, and old habits. There is a particular habit that inspires this note, which is a tendency to:
- Sprint ahead as work begins
- Use time to gather insights
- Sometimes prototype solutions unprompted
- Amass many questions related to unknown quantities, often referred to as "opens"
I've developed this habit by working in environments where planning, execution, and testing are all delegated to me as a "solo developer."
The term Technical Recon seems to fit best when describing this approach. In the past, this technique has helped me build trust with non-technical stakeholders. I have also used it to encourage earlier consideration of systems and edge cases. This has generally improved the fidelity of effort estimates during development.
And yet, Technical Recon can cause friction when applied out of habit. Particularly when the stakeholders for a project are technical experts themselves. You run the risk of shifting focus to non-issues which you may not comprehend. It can also be disruptive to production efforts.
Over the years, I've become more hesitant to apply Technical Recon during projects. My reasoning follows:
It can be stressful. The phrase "save the gas for when you need it" seems pertinent. Pushing toward complexity is sometimes necessary, but not always. For example, Technical Recon can be very useful when estimating high risk implementations.
It can derail personal focus and leave known quantities neglected. Over-application of Technical Recon has led to less-than-robust implementations of simpler systems. With attention spread too thin, more complex code becomes a "fan favorite". Known quantities become an afterthought. This is sometimes necessary - when timelines are hyper-aggressive - but it runs contrary to my love of robust software engineering.
It can be disruptive to the greater team. Asking team leads increasingly obscure questions. Proposing esoteric problem solving during focused meetings with teammates. It feels healthier - both socially and technically - to have a shared set of immediate problems to solve. An insular set of long-term "maybe problems" is less suited to team discussions.
The remedy for these types of issues, strangely enough, is to do less and think less. Space can still be reserved for Technical Recon - even as projects are in mid-production - but it can become a team effort. Once complete, the focus can shift back to immediate problems.
Having issues with asset URL transformations not occurring within custom component props when using Vite ~2.9 and Vue ~3? You can solve this by passing additional plugin configuration to template.transformAssetUrls inside your vite.config.[js|ts] file.
import { resolve } from 'path'
import { defineConfig } from 'vite'
import vuePlugin from '@vitejs/plugin-vue'

const vueConfig = {
  template: {
    transformAssetUrls: {
      tags: {
        // Custom components
        MyImage: ['src'],
        // Note that we include default transformAssetUrls; sadly there isn't
        // an easy way to merge into these without redefining them explicitly.
        video: ['src', 'poster'],
        source: ['src'],
        img: ['src'],
        image: ['xlink:href', 'href'],
        use: ['xlink:href', 'href'],
      },
    },
  },
}

export default defineConfig({
  resolve: {
    alias: {
      '@': resolve(__dirname, './src'),
    },
  },
  plugins: [
    vuePlugin(vueConfig),
  ],
})
In this example, assume we have a custom component called <MyImage> and we want any src props to receive asset URL transformations. By default, only a handful of tags and attributes receive this pre-processing, such as img[src]. The additional configuration we've added ensures that asset paths (and aliases!) are properly processed by Vite.
Now we can use <MyImage src="@/assets/images/logo.png" /> to properly reference a file located at src/assets/images/logo.png without any unexpected 404s, and to gain benefits like automatic inlining of small assets, cache busting, etc.
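For reference, the <MyImage> component itself stays oblivious to all of this: it simply receives the already-resolved URL as its src prop. A minimal, hypothetical sketch:

<!-- MyImage.vue (hypothetical sketch) -->
<script setup lang="ts">
defineProps<{ src: string }>()
</script>

<template>
  <img class="my-image" :src="src" />
</template>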
Orbital Market is a third-party search and filtering tool for the Unreal Engine Marketplace, and offers a wildly improved user experience when browsing content. Highly recommended!
Fresh is a new full-stack application framework for the Deno runtime. Looks like a nice evolution of the Deno ecosystem, which has previously seemed more suited to one-off uses in contexts such as compute-as-a-service / AWS Lambda.
TensorWorks is creating an Unreal Engine powered pixel streaming offering called Scalable Pixel Streaming. Part of this effort is an open source WebRTC player frontend built as a drop-in replacement for the Epic-provided pixel streaming samples - see UE source access on GitHub.
This is wonderful news for a few reasons:
- The core, Epic-provided pixel streaming samples are of proof-of-concept quality, and not fit for production use.
- The complexity of the Epic-provided proof-of-concept is extremely high, while the quality of its organization and architecture is very low.
- The Epic-provided proof-of-concept makes gratuitous use of window globals, which makes integration with a larger app fragile and error-prone.
- Because of the above, it is very difficult to modify the code with any amount of confidence.
I'll be working on a minimal implementation using TensorWorks' new frontend library this week. Hopefully I'll have more details to share soon ⚡
Working with Vite, TypeScript, and Vue 3 in Visual Studio Code is a pretty great user experience. One of the things that I needed to tackle early on was understanding how to get custom import aliases working smoothly, ideally with auto-complete support. Luckily, I was able to figure this one out without too much headache 🙌
Taking care of the basics:
- Ensure the Volar VSCode extension is installed and set up in takeover mode
- Add custom aliases in tsconfig.json as well as vite.config.ts. More details on this below
- Add "baseUrl": "." to "compilerOptions" in tsconfig.json
  - This one is a bit odd, and I'm not sure why it's required. However, without this option, auto-complete doesn't seem to run properly. Cheers to zigomir on GitHub for posting this solution.
Setting up custom aliases:
I added a couple of aliases to both tsconfig.json and vite.config.ts to get easy access to common paths within my imports.
My tsconfig.json looks like this:
{
  "compilerOptions": {
    // ...
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"],
      "~/*": ["./*"]
    }
    // ...
  },
  // ...
}
And vite.config.ts looks like this:
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'
import { resolve } from 'path'

export default defineConfig({
  resolve: {
    alias: {
      '@': resolve(__dirname, './src'),
      '~': resolve(__dirname, './'),
    },
  },
  plugins: [vue()],
})
This gives me quick access to the project root via ~/ and the source root via @/. Along with the baseUrl option in tsconfig.json and the excellent Volar extension in VS Code, everything seems to be working great.
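As a quick sanity check, imports like these (the file paths are hypothetical) should now resolve in Vite and auto-complete in VS Code:

// e.g. in src/main.ts
import { createApp } from 'vue'
import App from '@/App.vue'               // resolves to src/App.vue
import { formatDate } from '@/utils/date' // resolves to src/utils/date.ts

createApp(App).mount('#app')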
Hope this helps you in your IDE configuration journey 😊
The wild and wacky edge cases of Character Counters on the web.
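A taste of the weirdness, sketched in plain TypeScript (my own example, not from the article): String.prototype.length counts UTF-16 code units, so emoji and other composed characters throw naive counters way off.

const family = '👨‍👩‍👧‍👦'

console.log(family.length)      // 11 (UTF-16 code units)
console.log([...family].length) // 7  (code points; the ZWJ sequence splits apart)

// Counting user-perceived characters (grapheme clusters) needs Intl.Segmenter
const segmenter = new Intl.Segmenter('en', { granularity: 'grapheme' })
console.log([...segmenter.segment(family)].length) // 1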
Mike Royal maintains an amazing set of Unreal resources on GitHub. Check it out and marvel at this absolute chonk of a resource list.
I'm finding it difficult to create expressive documentation with only Markdown in GitLab, despite their numerous GitLab Flavored Markdown extensions.
After spending years writing documentation and communicating feature proposals with Notion, raw Markdown seems to lack a certain flexibility that I've come to appreciate.
I may begin looking into interop solutions w/ the Notion API, or perhaps look into other extensibility options available to a custom GitLab instance.
Universal Scene Description, or USD, is an interchange format that can encode 3D scenes for transfer or representation in other programs. It can be compressed to binary, or placed into a human-readable USDA format.
I like to think of it as "SVG but for 3D" in that it can represent anything from primitives like Spheres or Cubes, to complex geometry at the vertex level. USD can also store lighting information via dedicated schemas like UsdLux, which can later be used by an external rendering tool.
I'm reminded of features like <ellipse>, <path>, and <clipPath> in SVG, where abstract data is used by an external rendering engine (typically a browser) to recreate a graphical representation at nearly any scale.
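To make the analogy concrete, here's a tiny hand-written USDA scene (my own sketch) that describes nothing more than a sphere parented under a transform:

#usda 1.0

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2
    }
}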
The format was built with goals like minimal memory-footprint and general flexibility in mind. The core USD documentation is wonderfully detailed, and doesn't hesitate to make the flexibility of the technology very clear:
[...] the primary directive of USD: scalable interchange of geometric, shading, and lighting data between DCC’s (e.g. Maya, Blender, 3DSMax) in a 3D content creation pipeline.
And lastly, this format is not without competition. glTF is a similar offering from the Khronos Group, the consortium that brought us Vulkan, WebGL, OpenGL, and other technologies.
Interchange formats... FIGHT!
Or maybe not? Despite overlap in concerns, there are users who find them to be complementary formats. Below is an excerpt from the discussion following a Blender release on Hacker News.
arminiusreturns asks:
Do you happen to have any thoughts on the relationship of gltf to USD? Are they complimentary or competitive?
yki writes:
I think they are very much complimentary. This vastly oversimplifies things, but you can can almost think of gltf is to USD like PNG is to a Photoshop file.
gltf is meant to be a 3D publishing format, whereas USD is meant to be an asset interchange format between DCC applications. gltf is very minimal in the amount of runtime processing required to interpret and display it (much of gltf can more or less be directly mapped into GL buffers), whereas USD includes an extremely powerful schema and composition system that is highly expressive but much more complex to interpret in its fullest form (meant for sending entire editable scenegraphs between, say, Maya and Houdini and Blender and Presto).
Both formats are really great at what they’re meant to do.
dagmx writes:
Both complementary and competitive.
USD is a much more complex beast with powerful composition workflows that allows for meeting the needs of large productions. Gltf is more of a distribution format at the end of a production like this. So in this regard they're complementary.
USD can also be extended to read other formats like alembic and obj and present it as if they were USD data at runtime. A gltf plugin could be made for USD to allow gltf to act as a leaf node.
There is also a packaged variant of the USD formats called usdz which is what Apple uses for their AR ecosystem. This can be seen as competitive to gltf since they are both now distribution formats and not production formats first.
A quick hack (yes, it's a hack) to increase CSS selector specificity is to double up on class names. I've used this so many times in cases where an !important would be the only other option, e.g.
.someClass {
  /* Perhaps defined elsewhere in 3rd party code, a base set of styles */
  color: blue;
}

.someClass.someClass {
  /* Our override code, note the doubled-up .someClass selectors */
  /* Technically, you can add even more to further increase specificity */
  color: red;
}
Unreal Directive brings a ton of highly polished articles, videos, and tips to everyone for... uh, for free?! Great knowledge here, made better by attention to detail + presentation.
ben.ui is absolutely packed with UI implementation knowledge from an Unreal Engine expert. Highly recommended.
Unreal Containers exists, with tons of resources for sysadmins looking to get Unreal Engine instances running in the cloud.
There is a neat npm package called gitignore - written by GitHub - which lets you quickly bootstrap .gitignore files in your current directory. Try it out with npx gitignore [techstack], e.g. npx gitignore node for a Node-specific set of ignores.
A parallelepiped is a six-faced solid whose faces are all parallelograms, essentially a 3D parallelogram or a sheared cube.
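For the record, its volume is the absolute value of the scalar triple product of the three edge vectors that span it: V = |a · (b × c)|.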
HPC stands for High Performance Computing, aka the development and use of supercomputers for processing and simulation tasks.
High-fidelity simulations used to model workflows for automotive factories can require and/or produce more than 20 petabytes of data.
GitLab does a great job of organizing the humble "repository" into higher-level units such as Projects and Groups. Issues and even Wikis can be created at levels higher than a single repository, which somewhat impedes discoverability, but makes for much clearer structure in the long term.