Trials in a Perception Engine
Life as a series of Trials
Easy to upsell generated input, generated realities, as less of a test, when in
fact they could be much more of a test. Things that we rely on can be taken
as reliable on a split second's notice, like gravity, or magnetism.
A system that can give superpowers on a moment's notice can also take them away on a moment's notice.
Reality can be made awful, hell-like, and there is no requirement to make it better, to reduce trials.
Reliance on a God that cannot be trusted for instant delivery on prayers. Delivery on prayers cannot be
easily verified as having happened or not happened. An iffy reality that can turn perceived wins into losses
in a matter of moments gives the feeling that prayers are not always answered in a way that ideally amplifies.
Gains and contributions to society: forging the strongest sword for a king is potentially less useful at times
than pouring a glass of water. Direction and Capacity, Power in what is Useful, is iffed in not fully appreciated
ways.
What is Power, what can be Perceived as Power, as the Right choice to enable, can be iffed over and over.
Preparation is directional investment in the future. If I spend all day pouring water bottles, tomorrow
I am likely to have water to drink, but potentially no bread.
Typing throughput is directional power, potentially over time, like writing. It has the potential to create
Power for the ages, though not always the ideal power for the moment.
A book to those 200 years in the future? A dead man writes words of wisdom to a time where he knows only a small piece of their backstory?
Written Word has a certain amount of Power.
Never Say Die! Potentially the perfect words of encouragement for a person at a future date I know nothing about.
Potential for reality to be iffed in different directions. A moment in history that plays out perceived one way by someone and perceived in a completely different, equally valid way by someone else, all with societies and realities that create two unique perceptions of reality, A and B.
They could be B and A, depending on who you talk to. Showing two distinct possible versions without establishing order is difficult; even random numbers or characters have the potential to be viewed as one going before the other.
Obfuscating the Order in a way that keeps the variable set unbiased by their naming?
VarA and Var1? Is 1 first or is A first?
Naming, writing down options like Option 1 and Option 2, is claimed to have zero effect. Yet if I need to flip a coin, heads is likely to go to Option 1 and tails to Option 2.
Why are x and y useful to Algebra teachers?
Why are i and j useful to Programming teachers?
Common, repeatable variable choices factor into the ability to comprehend.
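A minimal sketch, in Python, of why those shared letter habits help: a nested loop written with the customary i and j reads at a glance, where unfamiliar names would slow the reader down. The grid size and the names rows and cols are made up purely for illustration.

    # Conventional loop indices: i walks the outer dimension, j walks the inner one.
    # The 3x3 grid and the names rows/cols are hypothetical, just for illustration.
    rows, cols = 3, 3
    for i in range(rows):
        for j in range(cols):
            print(f"cell ({i}, {j})")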
If I say Rise Over Run, y = 2x, or y = x squared, it gives a picture to some.
If I say a cliff is y = 100x, 100 feet vertical for every 1 foot forward,
steep and daunting is evoked in the imagination. The ability to paint on the imagination using math is useful, and begs the question: are Math and Numbers really a system free of contrast?
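As a rough worked illustration, assuming the cliff really does follow y = 100x, rise over run gives a slope of 100, which works out to an angle of almost 90 degrees from horizontal; the numbers here are just the ones from the sentence above.

    import math

    # Hypothetical cliff y = 100x: 100 feet of rise for every 1 foot of run.
    rise, run = 100.0, 1.0
    slope = rise / run                      # 100.0
    angle = math.degrees(math.atan(slope))  # roughly 89.4 degrees from horizontal
    print(f"slope = {slope}, angle = {angle:.1f} degrees")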
Logic, Reason, Math, and Books were all written by those trying to Maximize Profit in Some, Not All, Directions. Trying to increase Hope, Encouragement, and Throughput for the World and Future Generations? Valuable, but to say that covers all directions is far from the truth.
A program that is coded is like forging a gear in a clock: it will function for some time, not an indefinite amount of time, and it has the potential to give blueprints and the potential to be useful, for example, for improving future designs.
The largest breakthroughs in parallel and multithreaded computing might be 100 or 200 years out. Delivering those breakthroughs in advance might be possible; that does not mean society will always invest in them.
Investments in Safe Medicine, Safe Engineering, and Better Bridges built are possible, and far from a given over time. Getting many to buy into a future that has more potential, in a way that does not allow a corrupt few to capitalize on that power, is not a given, and if done right it has the potential to lead to large throughput gains that could catapult, leap the future to advancements only dreamed of as possible.
Investments in more smiling pets and fewer scratches are Tech for the future; will it be appreciated as such?
Not dealing with a dog bite, a scratch, or the wrong kind of papercut has the potential to enable typing throughput at very critical times. It is easy not to fully appreciate the importance of the systems required to actually deliver the faster throughput needed to make tech Actually Shine.
Potential Deadlocks avoided like an untimely papercut?
Deadlocks and Race Conditions, like special Use Cases capable of being Emphasized to reduce potential losses incurred by the Complexity overhead?
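A minimal sketch of the kind of Race Condition being hedged against, using Python's threading module; the counter and the thread counts are arbitrary stand-ins. Without the lock, two threads can read the same value before either writes it back, so increments are silently lost; holding the lock serializes the read-modify-write.

    import threading

    counter = 0
    lock = threading.Lock()

    def increment(times):
        global counter
        for _ in range(times):
            with lock:        # remove this lock and increments can be lost (a race condition)
                counter += 1  # read-modify-write is not atomic without protection

    threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)  # 400000 with the lock; often less without it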
Multithreading
The Answers to Problems, the solutions to particular problems, can be more usefully designed for Throughput in Problems that have less reliance upon order and less reliance upon external dependencies.
Choosing your Problems wisely, Problems that are likely to fare well in a Multithreaded, Multiprocessing system, factors into the ability to utilize those resources effectively.
I have a phone capable of playing a video with sound, and my computer has the ability to play videos with sound. If I play the video with sound on my phone, I am freeing up those resources on my computer to do other useful work. There is high potential for those gains to be possible, and more useful, in a fraction of the time.
Increased Throughput can increase Peak critical performance; Peak performance utilized, versus processing cycles just sitting idle, is far from a given. Like having 4 phones: playing a video on 1 is possible, while playing videos on all 4 might be less useful. Making those other 3 phones useful as well might be possible, yet it is far from a given.
Peak performance Matters. Humans might spend 8 hours a day on a computer; that means the potential for 16 hours of useful cycles that are wasted, though potentially saving power. Running a computer 24/7 has the potential for other problems.
Computers are kind of like cars where the wheels have less potential for wear and tear. A system constantly in motion, processing and testing, has the potential to be more available than one turned on at a random time in the future. With no updates to firmware and security over time, there is potential for a system turned on 4 months later to be more susceptible to problems. Potential for dust to have gathered on the fans.
A system more tested is more likely to see potential problems in advance. A system within the normal lifespan of its components is potentially more likely to see edge case anomalies, such as a bad hard drive or bad processor, if run for a longer period of time.
Ideally one could turn off a device, put it in a time capsule, and 100 years later that time capsule could be opened and the device could be charged and powered on with zero complications. Potential for new wireless interference, new problems that exist in the future that did not exist at the time the technology of the day was created.
The battery could have degraded over time, and other components could degrade in non-ideal ways. Potential for parallels, similar though not exact, to old wooden machines or old brass clocks.
If a player Piano were put in a time capsule, 100 years later would it play a tune? Would tuning be maintained over a 100 year period?
A Player Piano is kind of like a processor. Four player pianos could potentially be played all at the same time, potentially more useful if all in different rooms or locations.
Common problem set: people like Music and Encouragement, and Sound Throughput has a Proximity-is-Useful Property.
There are many tasks that have sequential properties, like washing dishes with one sink. A sink is a common resource that multiple dishwashers might need at the same time. Trying to split the task across 10 dishwashers with one sink has the potential to create more complexity and reduced throughput instead of increasing throughput.
Processors are kind of like Employees: directing more at the same Task might not always increase the speed at which the problem is solved. Too many cooks in the kitchen has the potential to reduce mobility. Consider a theoretical kitchen where the number of cooks approaches 1000; movement becomes impossible and throughput is reduced dramatically, as in the sketch below.
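A minimal sketch of that one-sink bottleneck, with made-up dish counts and timings: however many washer threads are started, the shared sink lock lets only one rinse happen at a time, so adding washers adds coordination without adding much throughput.

    import threading
    import time

    sink = threading.Lock()  # the single shared sink

    def wash(dishes):
        for _ in range(dishes):
            with sink:             # every washer queues for the same sink
                time.sleep(0.001)  # the rinse step that cannot overlap

    # Hypothetical numbers: 10 washers, 20 dishes each.
    washers = [threading.Thread(target=wash, args=(20,)) for _ in range(10)]
    start = time.time()
    for w in washers:
        w.start()
    for w in washers:
        w.join()
    print(f"10 washers sharing 1 sink took {time.time() - start:.2f}s")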
Think of 1000 different restaurants, all with different sinks; then dishwashing can be happening all at the same time because the problem sets do not depend on each other. Dishes at Restaurant A are a completely different problem set than Restaurant B.
Similar problem, different data.
Same instruction, multiple data.
Wash dishes is the instruction.
Data could be dishes from RestaurantA, B, C, D … Restaurant1000.
https://en.wikipedia.org/wiki/Single_instruction%2C_multiple_data
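A minimal sketch of that same-instruction, multiple-data shape using Python's multiprocessing; the restaurant names and dish counts are invented. One instruction, wash_dishes, is mapped over independent per-restaurant data, so no worker waits on another worker's sink.

    from multiprocessing import Pool

    def wash_dishes(restaurant):
        # One instruction applied to one restaurant's independent pile of dishes.
        name, dirty_dishes = restaurant
        return name, [f"clean {dish}" for dish in dirty_dishes]

    if __name__ == "__main__":
        # Hypothetical data: restaurants numbered 1..1000, each with its own dishes and sink.
        restaurants = [(f"Restaurant{i}", [f"plate{j}" for j in range(5)]) for i in range(1, 1001)]
        with Pool() as pool:
            results = pool.map(wash_dishes, restaurants)  # same instruction, multiple data
        print(results[0])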
A similar problem set could be Editors and News Articles.
1000 News Reporters and 50 editors.
All 50 Editors looking at the same document could reduce editor throughput, yet if all 50 editors take 20 articles each, there is potential for much faster editing throughput.
Overlap is costly, and sometimes necessary. If the 50 editors were broken into 2 groups, there is potential for each editor to be required to read 40 documents, yet that would allow verification.
25 editors read all the documents (40 each), then the second set of editors verifies the first editors' results (40 each). Complexity that increases Throughput has a potential need for greater verification.
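A minimal sketch of that split, with the same invented numbers: 1000 articles dealt out to 25 first-pass editors (40 each), then dealt out again to 25 verifying editors (40 each), so every article is read twice.

    # Hypothetical workload: 1000 articles, 25 first-pass editors, 25 verifiers.
    articles = [f"article{n}" for n in range(1000)]

    def partition(items, workers):
        # Deal items out so each worker gets an (almost) equal share.
        return [items[w::workers] for w in range(workers)]

    first_pass = partition(articles, 25)   # 40 articles per editor
    second_pass = partition(articles, 25)  # 40 articles per verifying editor
    print(len(first_pass[0]), len(second_pass[0]))  # 40 40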
Complexity is not cost free. More processing cores used on a problem means potential for problems that can't always be seen in normal operation, like more heat on the chip generated by larger problem sets, that might only show up after hours of processing.
More moving parts, including CPU cores, means potential for more that can go right and potential for more that can go wrong. Potential for more concentrated liability as well. If one computer can do the same thing as 20, then that one machine failing is more like 20 computers failing than one failing.
All that said, with the problem set chosen right, 1000 processors computing has the potential to give a head start searching for something in very powerful ways. If it takes one processor 5 hours to search a document, that document split up into 1000 pieces and searched (assuming it is not a file that cannot be broken up) can lead to a search time close to 5/1000 of an hour: 300 minutes / 1000 processors = 0.3 minutes. What would take one processor 5 hours to find could now be found in 0.3 minutes.
A cost of 0.3 minutes versus 5 hours changes what can be done over a 5 hour period. There is potential for 900 of the same problems to be solved in the same amount of time, assuming all those problems were lined up in a useful way. Or potential for swiftly finding (0.3 minutes), and then that same processor is useful 1 hour later for searching swiftly on a new problem that would otherwise still be occupied with the last problem.
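A minimal sketch of that split-and-search idea with Python's multiprocessing; the document text, the search term, and the 1000-piece split are all stand-ins. The 5-hours-to-0.3-minutes arithmetic assumes the pieces really can be searched independently; here the chunks overlap by the length of the term so a match straddling a boundary is not missed.

    from multiprocessing import Pool

    def search_chunk(args):
        # Search one independent slice of the document for the term.
        chunk_id, text, term = args
        return chunk_id, term in text

    if __name__ == "__main__":
        # Hypothetical document and term, split into 1000 overlapping pieces.
        document = "lorem ipsum " * 100_000 + "needle" + " lorem ipsum " * 100_000
        term = "needle"
        pieces = 1000
        size = max(1, len(document) // pieces)
        chunks = [(i, document[i * size:(i + 1) * size + len(term)], term) for i in range(pieces)]
        with Pool() as pool:
            hits = [cid for cid, found in pool.map(search_chunk, chunks) if found]
        print(f"term found in piece(s): {hits}")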
Problems solved faster has the potential to dramatically increase Throughput.
Important to remember: juggling tasks has the potential to increase complexity. Just because I have more than one computer does not mean there is zero cost to video editing on another device and hitting the export button. It still requires a non-null time cost to load a document into a program, make changes, and then hit export (the process that can be left to work on its own).
Improved focus is power, power that is not always fully appreciated.
Working on the task at hand, reducing external distractions, factors into the ability to process data and complete tasks successfully. Increasing complexity and requirements has the potential to dramatically decrease throughput in non-ideal ways.
Interrupts, while sometimes necessary (a burning building, a much higher priority item like needing to call 911), are not low cost for completing a task. Tornadoes are a rare event that has to be factored into potential interrupts. Unlikely does not equal zero potential for affecting throughput.
There are many libraries in the World; you are unlikely to be in a library that is burning down, unlikely to be in a library in the middle of a Tornado Watch/Warning. Being able to process interrupts is important, though their being unlikely factors into a greater ability to focus on the task at hand.
A Tornado Siren is Powerful, and Loud, likely to be heard.
Someone screaming in silence about Oppression? Less likely to be heard.
Interrupts that factor into faster problem solving and Throughput are not always heard.
Faster Systems for verification can be a lot of Power Not Always Fully Pondered. The faster something can be verified, the faster it can be run with, sometimes in non-ideal ways.