I don't know if this has any basis in fact (maybe with superconductors?), but one thing that might explain differences between our models of the Universe and observations is some sort of local dimension lowering in certain systems. Absent singularities, we assume that the stuff of the Universe is smoothly differentiable and lives in a field with a fixed, integer number of dimensions. But what if, when the curvature gradient of the mathematical space exceeds a critical threshold, the smoothness of that part of space breaks down and is replaced by a lower-dimensional lattice, plus some effects that can be measured relative to that lattice's natural coordinate system? This would have to happen near a singularity, so perhaps it could be a basis for explaining how cosmic censorship works (nature abhorring naked singularities).
I'm not sure if, mathematically speaking, you can have regular lattices (or non-regular ones like Penrose tilings) embedded in spaces of fractal dimension, but if so, this might offer an alternative place to look for models, beyond those that break down by reducing the number of dimensions by an integer. Or perhaps there's an orderly transition process by which the number of dimensions is reduced by a whole number via an intermediate set of fractal-space tilings. Or maybe there's no need for fractal dimensions at all: if the singularity itself has the character of an aperiodic tiling, the breakdown into a "tiling space" of dimension n-1 could be a function of the physical properties observable from outside and of the size of the tiling space itself. So this exotic "ice-9" kind of space wouldn't be able to crystallise/convert normal space beyond a critical threshold.
I think that dimension reduction/crunching does seem to take place near black holes, since we appear to go from a 4d field to a system with fewer dimensions (eg, the "surface" of a black hole seems to have one less dimension), but the problem with extending this explanation to more mundane systems is that we don't see singularities everywhere. So unless a kind of spontaneous "crystallisation" of pockets of space happens more commonly than we think, it's probably not a good explanation of dark energy/matter. However, maybe this is just an artefact of how we think about coordinate systems, and instead of renormalisation in quantum field theory, we should maybe be treating the infinities there as geometric objects, such as aperiodic tilings in one dimension lower than the main system, that can interact and evolve with each other in ways that provide an alternative mathematical understanding of the observed facts.
I know that this sounds quite crazy, but at least as an analogy, the idea of crystalline/lattice states isn't too far-fetched, and at least I can propose a concrete mechanism (some gradient metric exceeding a critical threshold) for normal states to spontaneously convert into the exotic state.
This whole saga seems ripe for a musical rendition on stage or film...
Thick as a Brick/Aqualung (Jethro Tull)
OK Computer/No Surprises/The Bends (Radiohead)
The Lucky One (Paddy "PADI" Casey)
Broken Household Appliance National Forest (Grandaddy, "The Sophtware Slump")
All Apologies (Nirvana) / The Apologist (R.E.M., "Up")
Spare a thought for the ground crew behind the Chinese launch of that micro-drill payload on that secret satellite. Why? Because they are obviously missing out on the cultural subtext of the exchange they could have had as they enacted their mission. To wit:
This is not a drill!
This is a drill!
Or also with a specific Unicode code point. For example, 平成 has its own "hankaku" (half-width) rendering that squashes the two characters together into a single cell: ㍻
I read somewhere (NHK News?) that even getting the new character into the official Unicode list is not trivial, and it may not even be possible to complete until a while after the era change. And then there are all those OS updates for all those devices out there...
The most pragmatic solution (for programmers and the like) is to just pretend that it's still the 平成 era for a while, then go through a period where those fictitious 平成 dates coexist with implementations of the new naming. Quite a lot of work and scope for problems, but not quite as bad as a Y2K-style counter overflow problem.
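For the curious, the existing era names already have their own square code points, and a slot has been set aside for the new one; a quick Perl sketch (code points as per the Unicode charts; the last one renders as tofu until fonts and OSes catch up, which is rather the point):

    #!/usr/bin/perl
    # Print the "square era name" characters and their code points.
    # U+32FF is the slot reserved ahead of the 2019 changeover.
    use strict;
    use warnings;
    binmode STDOUT, ':encoding(UTF-8)';

    my @eras = (
        [ 0x337E, 'Meiji'   ],
        [ 0x337D, 'Taisho'  ],
        [ 0x337C, 'Showa'   ],
        [ 0x337B, 'Heisei'  ],
        [ 0x32FF, 'new era' ],   # reserved; empty box on most systems today
    );

    printf "U+%04X  %s  %s\n", $_->[0], chr($_->[0]), $_->[1] for @eras;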
As a Perl programmer, I've been using YAML since... forever, it seems. It's "interesting" (for suitably low values of "interesting") that it's suddenly trendy.
I don't think it's a good language for user-editable config files, though, unless they're quite simple. For a start, I don't think users should be forced to look up syntax rules. Also, I like a config file format that allows comments, which you (or another user) can use to understand what's going on. YAML can have embedded comments, but if you slurp the file and then regenerate a new copy, those comments get lost. Compare this with simple key-value files that you can "source" in bash/sh (so regenerating or updating is just a case of changing specific KEY= lines and passing everything else, including comments, through), and YAML is just needlessly adding complexity.
Where YAML shines for me, though, is when it comes to working with complicated data structures. A typical programming task for me is to collect data from some mix of sources (eg, web pages) and pull out salient information for later processing. If I were doing this in a formal project, my manager would probably ask me to document a schema, create some databases and so on. However, usually I don't know in advance what those schemas might be, so my scripts just evolve, adding new data fields and even entirely new structures as I go. I often add "pointers" (cross-references) from one structure to another. At the end of it, I can just dump all of these structures into a single YAML file.
Later on, I can write a second script that reads that YAML file and creates something more refined out of it, eg, turning it into a proper database, with referential integrity and all that stuff. Or, this being Perl, I can keep the YAML file as a first-class data/object storage format, even embedding it into a library file if I want to. If I ever decide that I want to switch languages, eg, writing a C or Python application, I can either use the YAML import features of that language or write a simple code generator to, eg, output a set of literal C structs or whatever.
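To make that concrete, here's a rough sketch of the dump-now, refine-later cycle; the file name (notes.yml) and the data fields are invented for illustration, and I'm using the stock YAML module, though YAML::XS would do just as well:

    #!/usr/bin/perl
    # Phase 1: accumulate loosely-structured data, evolving the shape as
    # the script grows, then dump the lot into one YAML file.
    use strict;
    use warnings;
    use YAML qw(DumpFile LoadFile);

    my %scraped = (
        pages => [
            { url => 'https://example.com/a', title => 'Page A' },
            { url => 'https://example.com/b', title => 'Page B' },
        ],
    );
    # A cross-reference from one structure into another; YAML writes this
    # out as an anchor/alias pair rather than duplicating the data.
    $scraped{by_title}{'Page A'} = $scraped{pages}[0];

    DumpFile('notes.yml', \%scraped);

    # Phase 2 (in real life, a separate script run much later): read it
    # back and build something stricter out of it.
    my $data = LoadFile('notes.yml');
    print "$_->{title}: $_->{url}\n" for @{ $data->{pages} };

The nice part is that nothing needs deciding up front: the schema is whatever the dump says it is, right up until you choose to harden it.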
In summary, YAML is good for sloppy/fast development cycles with quite complex, loosely-defined data schemas, but if it becomes important to impose more structure (eg populating/updating a database with a more rigid schema, or embedding it into some other bit of code), then YAML is still a good stepping stone. Less end user, more rapid development aid.
3 is unlikely, as there is a standard dictionary ordering based on phonetic spelling ("A Kat Sat Thinking oN How Many Yakult (are) Rancid" being one possible mnemonic for the a-ka-sa-ta-na-ha-ma-ya-ra sequence).
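As it happens, Unicode lays out the hiragana block in that same order, so even a naive codepoint sort lands close to dictionary order; a toy Perl illustration (for the real thing, with voicing marks and long vowels handled properly, you'd want something like Unicode::Collate):

    #!/usr/bin/perl
    # Toy illustration: a plain codepoint sort of hiragana words
    # approximates a-ka-sa-ta-na-ha-ma-ya-ra-wa dictionary order.
    use strict;
    use warnings;
    use utf8;
    binmode STDOUT, ':encoding(UTF-8)';

    my @words = qw(やま かわ さくら たき なみ はな まつ らん わに あめ);
    print join(' ', sort @words), "\n";
    # あめ かわ さくら たき なみ はな まつ やま らん わに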
I'm still a bit concerned about how many textbooks are out there telling you that you have to learn the word 電報 ("telegram").
Reminds me of that old Japanese man ranting at the national public broadcaster (NHK, like the BBC) over loan-words entering the language. I did read an article pointing out some good arguments, such as that many of the "native" words he suggested are in fact imports from China, but I can't find it again.
String theory in particular seems to be a case of "throw it at the wall and see what sticks". (Same for dark matter, but I'm not going to root out the wine-dark honeyed centres here; it's a load of Bologna.)
(oh, and yea, it was prophesied and so his noodly appendages came to manifest and such)
Where's my hat?
Also... "reached out", "going forward", "1/200 or less than", "a mistake in how we communicated with our customer about the terms of its plan".
Only one way to deal with these: kill them with fire. It's the only way to be sure.