• 1 Post
  • 55 Comments
Joined 1 year ago
Cake day: July 1st, 2023




  • When I think of “stuffing” I think of people creating wholly illegitimate ballots, which does not seem to be what happened here.

    That’s exactly what ballot stuffing is, and why what these folks are charged with is not ballot stuffing:

    https://ballotpedia.org/Ballot_stuffing

    Ballot stuffing or ballot box stuffing is a form of electoral fraud in which a greater number of ballots are cast than the number of people who legitimately voted. The term refers generally to the act of casting illegal votes or submitting more than one ballot per voter when only one ballot per voter is permitted.[1]

    If the absentee ballots they handled were fabricated, or if the voters they came from had already voted, then yes, it would be “ballot stuffing”, but I didn’t see that in the article. Just “mishandling”.

    Still, it’s best that absentee ballots are handled properly so as to show the voter hasn’t also voted in person.



  • Do a search for your server OS + STIG

    Then, for each service you’re hosting on that server, do a search for:

    Service/Program name + STIG/Benchmark

    There’s tons of work already done by the vendors in conjunction with the DoD (and CIS) to create lists of potentially vulnerable settings that can be corrected before deploying the server.

    Along with this, you can usually find scripts and/or Ansible playbooks that will do most of the hardening for you. Though it’s a good idea to understand what you do and do not need done.








  • Frame times are the specific measure:

    <11.1 ms for 90 Hz or <8.33 ms for 120 Hz

    If the game, experience, or whatever frequently exceeds that frame-time budget, then you can experience nausea just from moving your head around.
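    The budget is just the reciprocal of the refresh rate, which is where those two numbers come from:

```python
# The frame-time budget is simply the reciprocal of the display refresh rate.
def frame_budget_ms(refresh_hz: float) -> float:
    """Maximum time (ms) the PC has to render each frame."""
    return 1000.0 / refresh_hz

print(round(frame_budget_ms(90), 2))   # 11.11 ms for 90 Hz
print(round(frame_budget_ms(120), 2))  # 8.33 ms for 120 Hz
```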

    It does require some sacrifices, like turning shadows down a notch or two in some game engines and choosing additional visual effects carefully. Some visual effects require additional computation passes and can add to the frame time.

    A low-latency CPU (like AMD’s 3D V-Cache CPUs), or a normal mid-to-high-end CPU with fast memory and good timings, helps quite a bit.

    The GPU should be capable of pushing the pixels and shading for the target resolution. Even with a 6900 XT I’ve been able to comfortably push over 4500x3000 per eye rendering (enough to get a nice anti-aliasing effect on my Pimax 8KX at the “normal” 150 degree H.FoV) in most games.

    Surprisingly, FidelityFX can help as well (the non-temporal version).



  • Would be nice if the author had done a bit of research on the specific things that have been done in VR to prevent nausea since he tried his DK2:

    An Oculus DK2, a PC that couldn’t quite run a rollercoaster demo at a high-enough framerate, and a slightly-too-hot office full of people watching me as I put on the headset. Before I’d completed the second loop-de-loop, it was clear that VR and I were not going to be good friends.

    The study the author quotes dates to August 2019!

    https://insidescience.org/news/cybersickness-why-people-experience-motion-sickness-during-virtual-reality

    For one, non-persistent displays have become the norm. These only show (strobe) the image for a fraction of the frame time and go black in between. Valve discovered that the full 1/90th of a second an image is displayed is enough to induce nausea if the head is moving during that time. So the Vive (and the Oculus Rift) had non-persistent displays.

    The strobing effect is so fast you don’t notice it.
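    A rough back-of-the-envelope sketch of why persistence matters: the image smears across your view in proportion to how long each frame stays lit while your head turns. The 100 deg/s head speed below is an assumed example, not a figure from Valve:

```python
# Rough illustration: angular smear while one frame stays lit during a head
# turn. The 100 deg/s head-rotation speed is an assumed example value.
def smear_degrees(persistence_ms: float, head_speed_deg_s: float = 100.0) -> float:
    """Degrees the world rotates while a single frame is still displayed."""
    return head_speed_deg_s * persistence_ms / 1000.0

# Full-persistence 90 Hz frame (lit the whole 11.1 ms) vs. a ~2 ms strobe:
print(round(smear_degrees(11.1), 2))  # 1.11 degrees of smear
print(round(smear_degrees(2.0), 2))   # 0.2 degrees
```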

    Elimination of artificial movement is another. The reason Valve focused on games with teleport movement and made a big deal of “room scale” early on was to eliminate the nausea triggers you encounter in other types of experiences.

    Valve had an early version of Half Life 2 VR during the days of the DK2, but they removed it as the artificial motion made people sick (myself included).

    For many, sims work as long as there is a frame in their field of vision to let their brains lock onto that non-moving frame of reference (e.g. car A-pillars, roof line, dashboard, the outline of a view screen on a ship interior, etc.). Note the frame still moves when you move your head, so it’s not a static element in your field of view.

    Also it helps if your PC can render frames under the critical 11.1 ms frame time (for 90 Hz displays). Coincidentally, 90 Hz is the minimum Valve determined is needed to experience “presence”. Many folks don’t want to turn down graphics options to get there. It’s doable in most games, even if it won’t be as detailed as it would be on a flat screen. Shadows are a big offender here.

    Resolution isn’t as big a factor in frame times as detailed shadows and other effects. I have run games at well over 4K x 2.5K resolution per eye and been able to stay under the 11.1 ms frame time.

    Lastly, it has been noted that any movement or vibration to the inner ear can, for many, stave off nausea. This includes jogging in place while having the game world move forward. For many years we’ve had a free solution that integrates into SteamVR:

    https://github.com/pottedmeat7/OpenVR-WalkInPlace

    Jog in place to make your character move forward in the direction you’re facing. Walk normally to experience 1-to-1 roomscale.

    I’ve used the above to play Skyrim VR without any nausea. Good workout too!

    For car, flight, and spaceflight simulators, a tactile transducer on your chair (basically a subwoofer without the cone) can transfer the game’s sound vibrations directly to you, and therefore to your inner ear, and prevent nausea.

    I’ve literally played over 1,000 hours of Elite:Dangerous this way as well as Battlezone VR and Vector 36. All games that involve tons of fast artificial movement.

    The main issue is that too many people tried out VR with Cardboard or old DK2 demos with low and laggy framerates, persistent displays, and poorly designed VR experiences, and simply wrote off all VR as bad and nausea-inducing.

    Edit: added links and trailers to the games mentioned so folks can see the motion involved. The “study” wasn’t a proper study. It was a quote from a scientist. No data was given about what headsets or which experiences caused nausea.


  • As a layperson reading through this, it seems to me the biomarker is related to the metabolic activity of a specific brain region called the subcallosal cingulate gyrus. They used imaging techniques versus, say, blood tests or depression questionnaires to identify the changes from the treatments.

    It’s confusing because the study linked above first talks about the deep brain stimulation, which was the main focus of the study (and how they effected the anti-depressant changes).

    This other link I found is more to the point:

    The subcallosal cingulate gyrus (SCG), including Brodmann area 25 and parts of 24 and 32, is the portion of the cingulum that lies ventral to the corpus callosum. It constitutes an important node in a network that includes cortical structures, the limbic system, thalamus, hypothalamus, and brainstem nuclei. Imaging studies have shown abnormal SCG metabolic activity in patients with depression, a pattern that is reversed by various antidepressant therapies.

    https://www.biologicalpsychiatryjournal.com/article/S0006-3223(10)01003-6/fulltext

    If the imaging is easy to do and interpret, then it would allow for a more objective way to measure the effectiveness of other depression treatments.

    As a roadmap of sorts: the corpus callosum is the part of the brain that connects the two hemispheres. Knowing that, look at:

    https://en.wikipedia.org/wiki/Cingulate_cortex

    Edit 2:

    This highlights the specific areas (side view) that the paper in the OP is referring to:

    https://www.researchgate.net/figure/The-anterior-cingulate-cortex-ACC-consists-of-subgenual-sgACC-perigenual-pgACC-and_fig1_351484383

    Depth-wise, the regions you see are basically in the middle of the brain (not left or right) where the two hemispheres touch, towards the front of your head, and right above the corpus callosum (the bundle that connects the two hemispheres).

    Basically, put the tip of your index finger above your eyebrows, centered above your nose, and you’re pointing at the right region, though it’s buried an inch or two deep.




    I know on the hearing aids themselves it’s called “Zen”, which includes white noise options, narrowband white noise, and the fractal tones. I also think there’s an option for combining white noise with fractal tones. I don’t know if there is a “notched therapy” option (playing white noise or other sounds while excluding the frequency of your tinnitus).
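    The core idea of notched therapy, removing a narrow band around your tinnitus frequency, can be sketched with a standard biquad notch filter. This is a generic illustration using the well-known Audio EQ Cookbook coefficients; the 6 kHz center frequency and Q value are assumed example numbers, not settings from any hearing aid:

```python
import math

def notch_biquad(f0_hz, fs_hz, q):
    """Audio EQ Cookbook notch filter coefficients, normalized by a0."""
    w0 = 2 * math.pi * f0_hz / fs_hz
    alpha = math.sin(w0) / (2 * q)
    a0 = 1 + alpha
    b = [1 / a0, -2 * math.cos(w0) / a0, 1 / a0]
    a = [1.0, -2 * math.cos(w0) / a0, (1 - alpha) / a0]
    return b, a

def filter_signal(b, a, x):
    """Direct Form I biquad: y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    y = []
    x1 = x2 = y1 = y2 = 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

# Assumed example: suppress a 6 kHz tone (a common tinnitus pitch) at 48 kHz.
fs, f0 = 48000, 6000.0
b, a = notch_biquad(f0, fs, q=5.0)
tone = [math.sin(2 * math.pi * f0 * n / fs) for n in range(fs)]
out = filter_signal(b, a, tone)
rms = lambda s: math.sqrt(sum(v * v for v in s) / len(s))
# Input tone RMS vs. heavily attenuated output RMS (steady state, second half):
print(round(rms(tone[fs // 2:]), 3), round(rms(out[fs // 2:]), 3))
```

    A real notched-noise therapy signal would apply a filter like this to broadband white noise rather than to a test tone, leaving everything except the tinnitus band audible.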

    The fractal tones can also be tuned by average frequency and the number of tones played per time period per channel. I know mine plays more tones on the ear opposite where my tinnitus is.

    I’ll post another reply if I can confirm a good fractal tones app. I did a short search in the past but gave up when I came up empty.