illospec drawing system 

This is my project for doing various kinds of automatic illustration; I forget if I've made threads about this before.

The key concepts driving it are: subdivision, interpolation, extrusion

I've been iterating on a few ways of expressing these things and hit on the idea of focusing on grid and coordinate systems. Cartesian and polar systems are really common: what if I slapped matrix transforms on them and defined interpolations?

A key insight I hit on today, to that end, is that I can dump all the concepts into the same object as a point structure, and then linearly interpolate along all of the dimensions at once (but usually holding most of them fixed) to get a resulting curvilinear grid with arbitrary ratios and weird bendy tiling patterns.

Then the final image can be defined as an expression of those interpolations across multiple coordinate spaces, as opposed to the usual vector/raster thing of defining it all point-by-point in Cartesian space.

Further, I can turn an interpolation on any axis into a subdivision by defining a sequence or iterative extrusion of ratios along it.
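
A sketch of what I mean by the combined point structure, with field names I'm making up on the spot (nothing here is final):

    import math
    from dataclasses import dataclass

    # Hypothetical combined point: Cartesian offset, polar offset, and a couple of
    # transform parameters all live in one structure, so a single lerp can blend
    # any mix of them (usually with most fields held equal at both ends).
    @dataclass
    class GridPoint:
        x: float = 0.0
        y: float = 0.0
        radius: float = 0.0
        angle: float = 0.0      # radians
        scale: float = 1.0
        rotation: float = 0.0   # radians, applied as a transform on the result

    FIELDS = ("x", "y", "radius", "angle", "scale", "rotation")

    def lerp(a: GridPoint, b: GridPoint, t: float) -> GridPoint:
        return GridPoint(*(getattr(a, f) + (getattr(b, f) - getattr(a, f)) * t
                           for f in FIELDS))

    def resolve(p: GridPoint) -> tuple:
        # Collapse all the coordinate concepts into one final Cartesian position.
        px = p.x + p.radius * math.cos(p.angle)
        py = p.y + p.radius * math.sin(p.angle)
        c, s = math.cos(p.rotation), math.sin(p.rotation)
        return (p.scale * (c * px - s * py), p.scale * (s * px + c * py))

    # Sweeping t along one axis while holding the others fixed traces one
    # curvilinear grid line; doing it on two axes gives the bendy tiling.
    a = GridPoint(x=0, y=0, radius=1, angle=0)
    b = GridPoint(x=2, y=0, radius=1, angle=math.pi / 2)
    print([resolve(lerp(a, b, t)) for t in (0.0, 0.5, 1.0)])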

vaccine 

Both parents have had their first dose now. Each time they discuss all their feelings about it, bad and good, I quote the SFGov web site's declaration: "The vaccine is safe, and very effective."

a theory i have around how the crypto markets actually work 

It's "Worse Is Better" monopolists butting heads with speculators on "Right Things".

If you submit that all existing systems are "Worse is Better" to some degree and have the adoption they do through inertia and gatekeeping, then Bitcoin is the original Worst Coin: it only has one major feature and it's inefficient, but it came first, thus it has the number 1 market cap. Ethereum is number 2 because it's the second Worst Coin: it's programmable, in a highly insecure and also inefficient way.

Now, normally, what happens is that it ends there: the worst things get adopted and monopolistic dynamics take hold until a disruption occurs.

But this is a space where speculation remains forever open. Speculators are the exciting part of capitalism because they build markets out of thin air; most of the things they put their money into fail or are actual scams, but the leftovers turn into Something.

Speculation in crypto has thus far proven to be gatekeeping-resistant. There's no barrier to making a new chain or copying old ones, so control of the distribution in one chain doesn't translate into control over finance. When things get bad, speculators just migrate to something with "better features".

green cryptocurrencies 

I am testing out Chia, this is the one from Bram Cohen (BitTorrent author) - still in beta, but it's one of the most fleshed-out and well-rounded new projects I've seen in ages. It uses hard disk space as the proving function. I'll have to dig out a drive and see how that goes.

chia.net/

In the last few months birdsite has become conspicuously full of people discussing who is in their clubhouse as if they are all 11 years old

re: Notes on NFT/cryptoart 

Further thoughts since it's continued to occupy me.

The thing that's different here is in the speculative aspect. Ordinarily there is gatekeeping friction that cuts off speculation - it stops at the point where someone says "well, where are my exploitation rights" because the final pricing mechanism comes from exploiting the property to trade a material good at some markup, which creates hard linkage to state-industrial mechanisms, IP law, etc. Most people cannot exercise these mechanisms, thus gatekeeping sets in.

Here, the speculative network is scale-free. The best buyers of an idea become speculators who see a way to contribute a bit of value and piggyback their own venture off this one, which means that "perennial" NFTs will tend to pick up more and more value as they gain influence even with zero material aspects to them. I am 100% confident of this because proof of stake coins themselves have successfully appreciated in value (despite being called scams every two seconds), and they have this same intrinsic lack of material, just ideas.

The stage we're in right now is simply highway robbery of crypto enthusiasts who cannot spot perennials and believe in their material rights. Hence they want NFTs that burn energy and massage their ego.

Notes on NFT/cryptoart 

There's been plenty of hype and smoke around this space lately, here are my takes:

1. There's rapidly growing awareness of the ecological aspects of securing the chain. The majority of NFT activity is on ETH right now; ETH itself is moving towards proof of stake, which is tremendously more sustainable. Even within the existing system there are already several proof-of-stake rivals to ETH. That doesn't undo the waste that's already been made and will be made in the near term, but I suspect we will see a satisfying resolution within 18 months.

2. It is undeniable that it's speculative and prone to a market crash, but going forward, NFT is one more option piling onto the last decade of new options - Kickstarters and Patreons and streaming and all of that.

3. Nobody (including the market participants) fully understands what it means yet. NFT sits at the intersection of "what's the value of art" and "what's the value of money". Good question.

4. This is an exciting space to watch, don't cut yourself off from it!

It's interesting to note that the Kleene star ("match zero or more repetitions of the character") is defined as fewest-matching, which means you have to backtrack n times before you consume n repetitions, but regex implementations in many programming languages default to greedy matching (consume all the repetitions) for their star operator.

In practice it's way more intuitive for me to reason about greedy matching to describe a pattern. I think the reason Kleene defined it the other way may have been because he was focused on describing "the set of all..." which would be excluded by a greedy operator.
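
For reference, here's how the two behaviors look in Python's re module (the greedy * versus the lazy *? variant):

    import re

    text = "<a><b><c>"

    # Greedy star: ".*" consumes as much as possible, so the match
    # runs from the first "<" to the last ">".
    print(re.match(r"<.*>", text).group())   # <a><b><c>

    # Lazy star: ".*?" consumes as little as possible, so the match
    # stops at the first ">".
    print(re.match(r"<.*?>", text).group())  # <a>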

I've never really gotten over how well constructed the Killers song "The Man" is, both the song and its video.

youtube.com/watch?v=w3xcybdis1

Everyone knows this man, or perhaps has been him at some point, and we see him humbled and hated, but never truly defeated, not because his passion is meaningful, but because he doesn't lose. "Nuh-nuthin can break me down," he stammers unconvincingly, recovering his composure just in time for the chorus.

I am learning about Thompson NFA construction for fast regular expressions and something I found myself saying in figuring it out was, "ah, there's the flip-flop!"

Because with pretty much every complex parsing/compilation idea I've ever dealt with, there is a binary flip-flop that occurs to describe two "states of parsing", "layers" or "levels" that do different tasks. A complex encoding gradually becomes restated as an expression of the two states and the points at which they "flip".

In the case of the NFA construction it's done through the distinction of regular symbols (token literals) and epsilon symbols ("null" tokens). This is a formalism that can be swapped out with recursion in many instances.
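
A toy sketch of simulating such an NFA (hand-built for the pattern "ab*c" rather than produced by the actual construction), just to show where the two kinds of symbols flip back and forth:

    # Edges are either a literal character or epsilon (None). The flip-flop is the
    # alternation between following epsilon edges (no input consumed) and literal
    # edges (exactly one character consumed).
    edges = {
        0: [("a", 1)],
        1: [(None, 2), (None, 4)],   # epsilon: enter the b-loop or skip it
        2: [("b", 3)],
        3: [(None, 2), (None, 4)],   # epsilon: loop back or exit
        4: [("c", 5)],
        5: [],                       # accepting state
    }
    ACCEPT = 5

    def eps_closure(states):
        """Follow epsilon edges until no new states appear."""
        stack, seen = list(states), set(states)
        while stack:
            s = stack.pop()
            for sym, nxt in edges[s]:
                if sym is None and nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    def matches(text):
        current = eps_closure({0})
        for ch in text:
            # literal step: consume one character
            stepped = {nxt for s in current for sym, nxt in edges[s] if sym == ch}
            # epsilon step: expand again before the next character
            current = eps_closure(stepped)
        return ACCEPT in current

    print(matches("abbbc"))  # True
    print(matches("ac"))     # True
    print(matches("abx"))    # False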

re: Taffy file format 

OK it is pretty straightforward to code up the subset of RE that I would actually want to use.

So I have basically barfed out a huge amount of code and will try to finish and test it tomorrow. One thing I am modestly dissatisfied with is the slight irregularity of page/block/division. Also, I haven't actually modelled the inheritance yet and that makes the match functions a little more complex.

TBH it seems like it might be too easy to make the integrity check take a very long time. But it's also extremely comprehensive and something that could be bypassed for speed.

Taffy file format 

With the constraints in place it becomes much harder to lose data integrity by e.g. pointing to the wrong block.

And because it operates on blocks and not bytes it's a lot more fungible. The assumption made is that if you're trying to use indirection, it's for a structured type. And if you want to describe variable length sequences, you could either match on a sequence of blocks or pages (null terminator style) or describe start and end as two references.
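
A rough sketch of the two options, with made-up block-type IDs (this is not the actual Taffy encoding, just the shape of the idea):

    # Hypothetical block-type IDs, purely illustrative.
    STRING_BLOCK = 7
    TERMINATOR = 0

    # Option 1: null-terminator style - a run of blocks of one type, ended by a
    # block of a terminator type.
    def read_terminated_run(blocks, start_index):
        run = []
        i = start_index
        while blocks[i]["type"] != TERMINATOR:
            run.append(blocks[i])
            i += 1
        return run

    # Option 2: explicit bounds - the parent stores two references (start, end),
    # so the run length is known up front.
    def read_bounded_run(blocks, start_ref, end_ref):
        return blocks[start_ref:end_ref + 1]

    blocks = [{"type": STRING_BLOCK, "data": b"ab"},
              {"type": STRING_BLOCK, "data": b"cd"},
              {"type": TERMINATOR, "data": b""}]
    print(len(read_terminated_run(blocks, 0)))   # 2
    print(len(read_bounded_run(blocks, 0, 1)))   # 2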

All of these qualities make it so that I can write a generic editor for Taffy and have it make sense to use on every level - it lets you program the schema, edit the values, and validate.

Taffy file format 

Made a move to completely rethink how this works.

Old: Nested sequences of typed bytes
New: Fixed size pages and blocks that contain sequences of typed bytes

Nesting is an awful quality for serialization because it makes the entire decoding process stateful, so I eliminated it, which means that now it would be feasible to edit Taffy files with a hex editor.

Division types work as before; they are as close as one gets to native values. But you can still declare inheritance on a division type to indicate more information.

The blocks are 256 bytes each and contain a fixed encoding of division type values according to the block type, so they will map almost directly to a struct or array of dynamic.

The pages are descriptors of the block types and contain, ah, 127 blocks each I think (256 bytes, minus a 2-byte type ID for the page, at 2 bytes per block).

I added a block reference as a division type: 3 bytes for page and 1 byte for block. And the nifty thing about this is that the reference is typed by the block, which motivated writing a programmable constraint system.
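
A little sketch of how those numbers work out; the byte order and exact field layout here are my guesses, not the actual format:

    # Layout constants as described above (my reading, not a spec).
    BLOCK_SIZE = 256                 # bytes per block
    PAGE_TYPE_ID_SIZE = 2            # 2-byte type ID at the head of a page
    BLOCK_TYPE_ENTRY_SIZE = 2        # 2-byte block-type entry per block
    BLOCKS_PER_PAGE = (BLOCK_SIZE - PAGE_TYPE_ID_SIZE) // BLOCK_TYPE_ENTRY_SIZE
    print(BLOCKS_PER_PAGE)           # 127

    # A block reference as a division type: 4 bytes total, 3 for the page index
    # and 1 for the block index (big-endian is an assumption).
    def pack_block_ref(page, block):
        assert 0 <= page < 2**24 and 0 <= block < 256
        return page.to_bytes(3, "big") + block.to_bytes(1, "big")

    def unpack_block_ref(ref):
        return int.from_bytes(ref[:3], "big"), ref[3]

    ref = pack_block_ref(page=1024, block=5)
    print(ref.hex())               # 00040005
    print(unpack_block_ref(ref))   # (1024, 5)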

Knowing that divisions, blocks and pages all have an integer type ID, the constraint operates as a set of match rules on type sequences - WIP on ruleset.

I should probably describe it as a regular expression.
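
Something like this is the shape I have in mind; the type IDs and the rule below are invented, it's just "constraint as a regex over type sequences":

    import re

    # Map each integer type ID to a single code point so Python's re engine can
    # run match rules over a sequence of IDs.
    def ids_to_pattern(type_ids):
        return "".join(re.escape(chr(t)) for t in type_ids)

    def ids_to_str(type_ids):
        return "".join(chr(t) for t in type_ids)

    HEADER, ENTRY, FOOTER = 1, 2, 3

    # Rule: one header block, one or more entry blocks, one footer block.
    rule = re.compile(ids_to_pattern([HEADER]) + ids_to_pattern([ENTRY]) + "+"
                      + ids_to_pattern([FOOTER]))

    print(bool(rule.fullmatch(ids_to_str([1, 2, 2, 2, 3]))))  # True
    print(bool(rule.fullmatch(ids_to_str([1, 3]))))           # False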

anime 

Vlad Love just dropped and this is a good one: a Mamoru Oshii project, and it shows a pretty sophisticated visual approach to some very tropey material. Live action footage of the Tokyo skyline, VN-style picture-in-picture over stylized backgrounds for talking, blended with occasional cuts to full animation. Plus the writing constantly pokes fun at its own deus ex machina.

The staging gives it the feel of a live performance, and this makes the urban fantasy elements uncomfortable in a good way.

Illospec illustration tool 

Working title of my "tracker music but graphics" project - I did a self-talk session while out in the rain developing this idea, which had been a bit muddled up to this point.

Realized that I can conceptualize the thing I was doing with monophonic music alongside this; describe everything in terms of directional vectors that are subdivided into a ratio. Then use the ratio to structure child vectors. That's not really new (guides have been used this way since forever), but where it becomes tracker-like is listing out a characterization for each division point and interpolated segment:

* How to interpolate the points (linear, splines, bezier handles)
* Patterns, colors, stroking rules per segment (variable line width)
* Fills defined by addressing another line segment's subdivisions
* Create presets and reuse them to put intricate details in subdivisions

There is some stuff to figure out with how the dependencies are edited and how interpolation is precisely defined (perhaps an envelope like on digital synths), but it makes sense as a building block.
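
A rough data-model sketch of how I picture one of these "tracks"; the names and fields are mine and nothing here is settled:

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical: a directional vector split by a list of ratios, with a
    # characterization attached to each resulting segment.
    @dataclass
    class Segment:
        interpolation: str = "linear"      # "linear", "spline", "bezier", ...
        stroke_width: float = 1.0
        color: str = "#000000"
        fill_ref: Optional[str] = None     # address of another segment's subdivision

    @dataclass
    class Track:
        start: tuple
        end: tuple
        ratios: list                       # division points as fractions in (0, 1)
        segments: list = field(default_factory=list)

        def division_points(self):
            (x0, y0), (x1, y1) = self.start, self.end
            return [(x0 + t * (x1 - x0), y0 + t * (y1 - y0))
                    for t in [0.0, *self.ratios, 1.0]]

    track = Track(start=(0, 0), end=(100, 50), ratios=[0.25, 0.5],
                  segments=[Segment(), Segment(stroke_width=2.0),
                            Segment(color="#ff0000")])
    print(track.division_points())
    # [(0.0, 0.0), (25.0, 12.5), (50.0, 25.0), (100.0, 50.0)]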

Taffy file format 

Resuming thoughts on this. There is some kind of handshake between the user code and the protocol going on here where the user will say, "I want to make this kind of query" and Taffy may say "OK, I need to prompt you here" but the types of allowable queries are baked into the sequence of types.

The stack idea shows some merit, I think. What if they were stacks with no division values, just identifiers? A 32-bit address space of stacks, which can push and pop 32-bit ints. The appearance of a type would signal a stack effect; it could push or pop from one or more stacks. And then the usercode would have a basis for the literal effect by monitoring those stack changes and mapping them into the division stream, motivating the full decoding.

That is, the writing of a Taffy decoder would be of this form:

1. Write callbacks for each declared stack effect (which is associated with a type)
2. Fill in the decoding of the division primitives that appear in each stack effect

And so there isn't any need to grok the thing at more than a local level; that was done when the schema itself was made. The stacks can be mapped literally or bypassed for convenience, but there is an automatic validation taking place regardless by the "no unbalanced stacks" rule.
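
A very loose sketch of that decoder shape; every name, type ID, and payload detail here is invented for illustration:

    from collections import defaultdict

    class TaffyDecoder:
        def __init__(self):
            self.stacks = defaultdict(list)   # stack id -> list of 32-bit ints
            self.effects = {}                 # type id -> (stack_effect_fn, decode_fn)

        def on_type(self, type_id, stack_effect, decode):
            """Step 1: the declared stack effect.  Step 2: the division decoding."""
            self.effects[type_id] = (stack_effect, decode)

        def feed(self, type_id, payload):
            stack_effect, decode = self.effects[type_id]
            stack_effect(self.stacks, payload)   # push/pop identifiers
            return decode(payload)               # map into the division stream

        def check_balanced(self):
            # "no unbalanced stacks": every push was matched by a pop
            return all(len(s) == 0 for s in self.stacks.values())

    # Usage sketch: a made-up "begin record" / "end record" pair of types.
    dec = TaffyDecoder()
    dec.on_type(1, lambda stacks, p: stacks[0].append(p["record_id"]),
                   lambda p: ("begin", p["record_id"]))
    dec.on_type(2, lambda stacks, p: stacks[0].pop(),
                   lambda p: ("end",))

    print(dec.feed(1, {"record_id": 42}))   # ('begin', 42)
    print(dec.feed(2, {}))                  # ('end',)
    print(dec.check_balanced())             # True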
