L1[00:10:21] ⇦ Quits: Fluburtur (~Fluburtur@2a01:e0a:d31:fe30:919e:aca6:c268:c643) (Read error: Connection reset by peer)
L2[00:32:38] ⇨ Joins: Wastl2 (~Wastl2@dynamic-078-055-181-239.78.55.pool.telefonica.de)
L3[00:34:44] ⇦ Quits: Wastl4 (~Wastl2@dynamic-077-011-150-192.77.11.pool.telefonica.de) (Ping timeout: 190 seconds)
L4[01:12:55] ⇦ Quits: fkrauthan (~fkrauthan@orca.servers.cogindo.tech) (Quit: ZNC - https://znc.in)
L5[01:13:39] ⇨ Joins: fkrauthan (~fkrauthan@orca.servers.cogindo.tech)
L6[01:17:55] ⇦ Quits: fkrauthan (~fkrauthan@orca.servers.cogindo.tech) (Client Quit)
L7[01:20:47] ⇨ Joins: fkrauthan (~fkrauthan@orca.servers.cogindo.tech)
L8[01:41:02] ⇨ Joins: purpletarget|ktns (~purpletar@d207-81-104-80.bchsia.telus.net)
L9[01:41:03] ChanServ sets mode: +o on purpletarget|ktns
L10[02:42:04] ⇦ Quits: _whitelogger (~whitelogg@uruz.whitequark.org) (Remote host closed the connection)
L11[02:42:18] ⇨ Joins: _whitelogger (~whitelogg@uruz.whitequark.org)
L12[02:57:02] ⇦ Quits: darsie (~darsie@84-112-12-36.cable.dynamic.surfer.at) (Ping timeout: 183 seconds)
L13[02:59:12] ⇦ Quits: Tank2333 (~Tank2333@p5ddaa43c.dip0.t-ipconnect.de) (Read error: Connection reset by peer)
L14[03:53:38] ⇨ Joins: Althego (~Althego@86FF52B5.unconfigured.pool.telekom.hu)
L15[03:53:38] ChanServ sets mode: +o on Althego
L16[05:57:03] ⇦ Quits: _whitelogger (~whitelogg@uruz.whitequark.org) (Remote host closed the connection)
L17[05:57:19] ⇨ Joins: _whitelogger (~whitelogg@uruz.whitequark.org)
L18[06:48:34] ⇦ Quits: Ezriilc (~Ezriilc@syn-096-059-122-065.res.spectrum.com) (Remote host closed the connection)
L19[07:12:05] ⇦ Quits: purpletarget|ktns (~purpletar@d207-81-104-80.bchsia.telus.net) (Quit: Any fool can calculate)
L20[07:24:52] ⇨ Joins: purpletarget|ktns (~purpletar@d207-81-104-80.bchsia.telus.net)
L21[07:24:52] ChanServ sets mode: +o on purpletarget|ktns
L22[07:55:36] ⇨ Joins: Deddly (~Deddly@193.181.34.88)
L23[07:55:36] ChanServ sets mode: +o on Deddly
L24[07:55:59] <Deddly> !nextlunch
L25[07:56:00] <LunchBot> Mega Bacon.
L26[07:56:22] <Deddly> I mean... I know that bacon is good. Mega bacon can be nothing but a success
L27[08:16:09] <Deddly> !mission
L28[08:16:10] <LunchBot> You attempt to insert a nuclear ramjet probe into Jool's atmosphere. You land with so much force the Mun disintegrates.
L29[08:16:39] <Deddly> Now that's a LOT of force. I should probably not have eaten so much bacon before planning that mission.
L30[08:41:37] ⇦ Quits: ergZay (~ergZay@c-73-162-189-54.hsd1.ca.comcast.net) (Quit: This computer has gone to sleep)
L31[08:47:53] ⇨ Joins: sandbox (~sandbox@host-78-147-177-116.as13285.net)
L32[09:01:57] <Mat2ch> Today I learnt that there are 16 DTMF tones and not just 12
L33[09:02:26] <Mat2ch> and that DTMF tones are used over amateur radio
L34[09:36:30] <Deddly> Cool. What do they use them for?
L35[09:37:26] <Mat2ch> remote controlling stuff
L36[09:38:08] <Mat2ch> I guess it's rather easy to build some analog circuit to recognize a dtmf tone and then switch something
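[Editor's note: the 16 tones Mat2ch mentions come from a 4×4 grid: four row frequencies (697/770/852/941 Hz) times four column frequencies (1209/1336/1477/1633 Hz), with the rarely-used 1633 Hz column giving keys A–D. Detecting one tone is indeed simple; the standard digital equivalent of that analog circuit is the Goertzel algorithm, sketched below. This is an illustrative sketch, not anything from the channel.]

```python
import math

# 4x4 DTMF grid: the extra 1633 Hz column (keys A-D) is what raises
# the count from the familiar 12 tones to 16.
ROWS = [697, 770, 852, 941]
COLS = [1209, 1336, 1477, 1633]
KEYS = ["123A", "456B", "789C", "*0#D"]

def goertzel(samples, freq, rate):
    """Squared magnitude of `freq` in `samples` (Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

def decode(samples, rate=8000):
    """Pick the strongest row and column frequency, return the key."""
    row = max(ROWS, key=lambda f: goertzel(samples, f, rate))
    col = max(COLS, key=lambda f: goertzel(samples, f, rate))
    return KEYS[ROWS.index(row)][COLS.index(col)]

def tone(key, rate=8000, n=800):
    """Synthesize 0.1 s of the two-frequency tone for `key`."""
    for r, rowkeys in enumerate(KEYS):
        if key in rowkeys:
            f1, f2 = ROWS[r], COLS[rowkeys.index(key)]
    return [math.sin(2 * math.pi * f1 * i / rate)
            + math.sin(2 * math.pi * f2 * i / rate) for i in range(n)]
```

An analog decoder does the same job with eight bandpass filters; Goertzel is just those filters in software, evaluated only at the eight frequencies of interest.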
L37[09:44:53] ⇨ Joins: darsie (~darsie@84-112-12-36.cable.dynamic.surfer.at)
L38[09:50:47] <Deddly> I guess in the near future people will take the easy way out and install an AI they can communicate with in plain language.
L39[10:09:16] <Mat2ch> and have another AI to translate the machine bits to words?
L40[10:10:17] <Mat2ch> One thing I've learnt over my years experimenting with voice control is: I can press buttons much faster than talking. And it's much more reliable.
L41[10:11:43] ⇦ Quits: armed_troop (znc@2600:4040:712b:cb00:4eed:fbff:fe44:b834) (Ping timeout: 190 seconds)
L42[10:19:52] <Deddly> Very true
L43[10:20:31] <Deddly> Voice control can be very useful in certain situations, but unless it's close to 100% reliable, it's just really annoying.
L44[10:21:12] <Mat2ch> I always look to StarTrek TNG when it comes to voice controls. They did a fairly good job with it.
L45[10:21:27] <Mat2ch> They mix it between buttons and voice control.
L46[10:21:44] <Deddly> Yeah and I remember laughing at how unrealistic that was. "Will never be that accurate"
L47[10:22:26] <Mat2ch> Was already that accurate around 2000 with IBMs natural voice. But they asked such a high price for it, nobody outside of a professional business wanted to use it.
L48[10:22:26] <Pinkbeast> Although IDK why the interface has to make a _noise_ for everything
L49[10:22:41] <sandbox> I refuse to use voice for any device
L50[10:22:43] <Mat2ch> For the viewers to "see" that something is happening.
L51[10:22:46] <Deddly> And how they could program the holodeck by telling it "put a table in the middle of the room". We're already close to that with AI-generated art
L52[10:23:44] <Mat2ch> yes and no. In StarTrek the computer had an understanding of what a table is. So the table wasn't some dreamed up thing, but a sound table, with square edges, four legs, etc.
L53[10:23:47] <Pinkbeast> And like "AI" now it often backfires horribly
L54[10:23:51] <Deddly> I laughed at how the computer was able to parse Scotty's "Shut this bloody thing off". But GPT has no problem with that
L55[10:24:29] <Mat2ch> The problem with modern AI is: It doesn't understand a thing. It just gives you the most likely thing it learnt.
L56[10:24:49] <Deddly> AI is already backfiring, Pinkbeast. It is used extensively to spam online discussions and shape opinions.
L57[10:25:00] <Pinkbeast> Also Star Trek's computer can reliably distinguish "Eject the warp core!" from "well, I suppose if that did happen we could eject the warp core"
L58[10:25:09] <Pinkbeast> Deddly: er yeah, that's what I said :-)
L59[10:25:42] <Deddly> Pinkbeast, so you did. I think AI would have parsed that better than I...
L60[10:25:54] <Pinkbeast> The holodeck hasn't killed so many people, either
L61[10:26:08] <Deddly> Has AI killed people?
L62[10:26:40] <Mat2ch> Red shirt in the holodeck: "Haha, think about how dangerous this would be if we said: Disable the safeties"
L63[10:26:51] <Mat2ch> ChatGPT: "Safeties are now disabled."
L64[10:26:56] <Pinkbeast> Yes. Consider, for example, completely confident answers to mushroom identification from a machine that knows nothing
L65[10:27:45] <Deddly> Ah, mushroom pickers.
L66[10:28:06] <Pinkbeast> Or, say, electrical safety, fire safety...
L67[10:28:27] <Deddly> Fun fact, I actually identified an edible mushroom with the help of Google Lens. But of course I researched it thoroughly before eating it.
L68[10:28:57] ⇨ Joins: armed_troop (znc@2600:4040:712b:cb00:4eed:fbff:fe44:b834)
L69[10:29:10] <Pinkbeast> And (if you don't do the second bit) it's only got to get it wrong once
L70[10:29:10] <Deddly> Mat2ch, I think ChatGPT would actually get that one right. It's uncannily good at recognising intended meaning.
L71[10:30:25] <Deddly> Just like any other powerful tool, AI can be dangerous if used carelessly
L72[10:30:40] ⇦ Quits: King_Arthur (dalewyn@63.155.1.224) (Ping timeout: 183 seconds)
L73[10:30:59] <Pinkbeast> Well, except most powerful tools have an intended purpose that isn't making things worse
L74[10:31:19] <Deddly> I do confess I laughed at The Musk when we was warning people about the danger of AI a few years ago. What it is capable of doing was inconceivable to me back then.
L75[10:31:29] <Pinkbeast> "AI" is a bit like a chain of handgun-and-tequila shops
L76[10:31:30] <Deddly> we was = he was
L77[10:31:45] <Pinkbeast> I'm still laughing at him because he's going on about AGI which is no closer than before
L78[10:32:01] <Pinkbeast> The only similarity is the name
L79[10:32:07] ⇨ Joins: King_Arthur (dalewyn@63.155.5.207)
L80[10:32:29] <Deddly> He wasn't talking about AGI in the tweets I'm thinking of.
L81[10:32:50] <Pinkbeast> To me a decent chunk of the damage done by "AI" is making the AGI paranoia seem more credible, up to and including making the "EA" scam more effective
L82[10:33:19] <Deddly> What is "EA" in this context?
L83[10:33:23] ⇦ Quits: SporkWitch (~SporkWitc@syn-098-010-053-227.res.spectrum.com) (Ping timeout: 190 seconds)
L84[10:33:33] <Pinkbeast> "Effective altruism"
L85[10:33:42] ⇨ Joins: SporkWitch (~SporkWitc@syn-098-010-053-227.res.spectrum.com)
L86[10:34:35] <Pinkbeast> The idea that you should give money to the _best_ charitable causes (so far so good); that AGI is the biggest threat to humanity by a huge margin (hm); and so you should give all your money to whoever is doing the most to fight it, which _by a happy coincidence_ is the person explaining all this to you
L87[10:35:06] <Deddly> Ah
L88[10:35:20] <Pinkbeast> Happy for them, that is. :-)
L89[10:35:29] <Deddly> Take my credit card
L90[10:36:20] <Pinkbeast> EA kind of exploded because it turned out they were intimately intertwined with FTX but now it's coming back, like kudzu
L91[10:37:26] <Pinkbeast> (I'm not sure one can infer much about what Musk thinks from his tweets because he is one of those liars untroubled by consistency, but fair enough)
L92[10:38:40] <Deddly> I mean, people were joking about Skynet etc. and he responded with what the real threat was, which is what we are already seeing today.
L93[10:39:48] <Deddly> That's what he said. As to what he was actually THINKING, well...
L94[10:44:15] <Deddly> Thinking about Star Trek, I think it's interesting how in some episodes they make a big point about how humans are needed and that AI can't do what humans do, and then we all cheer on when Data does something superhuman.
L95[10:45:41] <Pinkbeast> ST also not always troubled by consistency :-)
L96[10:49:42] <Deddly> ST?
L97[10:49:51] <Deddly> Oh wait
L98[10:49:55] <Deddly> Got it
L99[10:50:37] <Deddly> Acronym cognition failure
L100[11:02:43] ⇨ Joins: Fluburtur (~Fluburtur@2a01:e0a:d31:fe30:9933:6014:8ca1:ef86)
L101[11:06:42] <KrazyKrl> ;outcome add Critical initialism failure. Aborting.
L102[11:06:42] <LunchBot> Added outcome: Critical initialism failure. Aborting.
L103[11:09:41] <KrazyKrl> ;mission add You unironically watch "Battlefield Earth."
L104[11:09:42] <LunchBot> Added mission: You unironically watch "Battlefield Earth."
L105[11:32:48] <Deddly> ;outcome add ChatGPT convinces you that everything is fine and engages you in a conversation about stylish industrial lighting.
L106[11:32:48] <LunchBot> Added outcome: ChatGPT convinces you that everything is fine and engages you in a conversation about stylish industrial lighting.
L107[11:37:00] <darsie> 1337
L108[11:45:32] <Mat2ch> Deddly: Musk warned about the dangers of an AGI. Which I still find not possible
L109[11:47:23] <Mat2ch> uh, what was the movie called where a gunship gets attacked by aliens?
L110[12:06:09] <Pinkbeast> Battleship?
L111[12:07:35] <Pinkbeast> (I also think there's a common bogus position that "AI" is going to revolutionise the world rather than just make it a bit worse, like cryptocurrency)
L112[12:08:15] <Mat2ch> I think that's it. Always wanted to view that one, but with popcorn. Sweet popcorn. :P
L113[12:09:16] <Mat2ch> The real problem with AI right now is that there are huge investors with too much money not knowing what to do with it anymore (they already caused the housing crisis by buying up every square meter they could find) and now have to throw their money at the next big thing
L114[12:09:45] <Mat2ch> crypto was so big, because there was too much money to throw around and with AI this is happening again.
L115[12:10:47] <Mat2ch> And sadly most of it makes it look bad and we don't see the good things neural nets can do, like finding cancer spots in CT images far faster and more reliably than humans
L116[12:11:17] <Mat2ch> or folding proteins to create new drugs or better understanding of how proteins work
L117[12:13:23] <darsie> Or understanding the DNA code and creating better humans. Or reverse engineering machine code, finding (exploitable) bugs.
L118[12:20:38] <Eddi|zuHause> !mission
L119[12:20:38] <LunchBot> You try to play Buzz Aldrin's Space Program. You gain negative Kardashev progress!
L120[12:31:26] ⇦ Quits: Deddly (~Deddly@193.181.34.88) (Quit: Uh oh...)
L121[12:35:14] <Mat2ch> yeah, but what everyone is talking about are assistants most of us don't need, so they can gather even more personal information, which they then sell.
L122[12:37:03] <darsie> log(0.1)=-1. The Kardashev scale might well be logarithmic.
L123[12:49:48] <Eddi|zuHause> my short wikipedia investigation mentioned that it can be extrapolated logarithmically
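[Editor's note: the logarithmic extrapolation darsie and Eddi|zuHause are alluding to is usually attributed to Carl Sagan, who interpolated the scale as K = (log10(P) − 6) / 10 with P the civilisation's power use in watts, pinning Type I at 10^16 W and Type II at 10^26 W. A one-liner for reference:]

```python
import math

def kardashev(power_watts):
    """Sagan's logarithmic interpolation: K = (log10(P) - 6) / 10."""
    return (math.log10(power_watts) - 6) / 10

# Type I  = 10**16 W -> K = 1.0
# Type II = 10**26 W -> K = 2.0
# Humanity at roughly 2e13 W lands around K = 0.73.
```

On this formula "negative Kardashev progress" just means your power use dropped below 10^6 W, which a failed Buzz Aldrin's Space Program run arguably achieves.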
L124[12:53:11] <Eddi|zuHause> the thing is, a hypothetical type-2 civilisation could be undetectable by visual telescopes
L125[13:11:30] <darsie> The system would ideally just emit IR.
L126[13:11:46] <darsie> But we have the Webb.
L127[13:21:28] ⇨ Joins: Ezriilc (~Ezriilc@syn-096-059-122-065.res.spectrum.com)
L128[13:48:23] <Eddi|zuHause> i made a mistake... i visited the cities skylines workshop, and now i need triple the memory i have...
L129[13:51:58] ⇦ Quits: Severian (~severian@c-76-130-56-58.hsd1.co.comcast.net) (Read error: Connection reset by peer)
L130[13:52:11] ⇨ Joins: Severian (~severian@c-76-130-56-58.hsd1.co.comcast.net)
L131[14:18:13] ⇨ Joins: ergZay (~ergZay@c-73-162-189-54.hsd1.ca.comcast.net)
L132[15:21:03] ⇦ Quits: _whitelogger (~whitelogg@uruz.whitequark.org) (Remote host closed the connection)
L133[15:21:20] ⇨ Joins: _whitelogger (~whitelogg@uruz.whitequark.org)
L134[15:50:15] <FLHerne> .mission add You spend two years developing a mod to turn KSP into a science-based, 100% dragon MMO.
L135[15:50:15] <LunchBot> Added mission: You spend two years developing a mod to turn KSP into a science-based, 100% dragon MMO.
L136[16:04:37] <Eddi|zuHause> that's not a very long time, considering
L137[16:04:46] <Eddi|zuHause> !mission
L138[16:04:46] <LunchBot> You add some communism. The outcome is chilling, like a piano made out of frozen Windex.
L139[16:05:56] <Eddi|zuHause> "what happens when you introduce communism to KSP? nothing for a long time, and then you run out of explosions"
L140[16:11:43] <FLHerne> Eddi|zuHause: https://www.reddit.com/r/gaming/comments/p1ssv/dear_internet_im_a_26_year_old_lady_whos_been/
L141[17:27:03] ⇦ Quits: _whitelogger (~whitelogg@uruz.whitequark.org) (Remote host closed the connection)
L142[17:27:18] ⇨ Joins: _whitelogger (~whitelogg@uruz.whitequark.org)
L143[17:28:41] ⇦ Quits: nshire (~neal@47.146.42.9) (Read error: Connection reset by peer)
L144[17:29:37] ⇨ Joins: nshire (~neal@47.146.42.9)
L145[17:31:24] ⇦ Quits: minas_tirith (~somebuddy@pika.powered.by.lunarbnc.net) (Quit: bye)
L146[17:31:49] ⇨ Joins: minas_tirith (~somebuddy@pika.powered.by.lunarbnc.net)
L147[17:35:53] ⇦ Quits: wallacer (~quassel@2605:6400:d0c6:1000::2) (Ping timeout: 183 seconds)
L148[17:48:27] <FLHerne> .outcome add The resulting craft is compact as bread, yet strong as unstoppable!
L149[17:48:27] <LunchBot> Added outcome: The resulting craft is compact as bread, yet strong as unstoppable!
L150[17:55:07] ⇦ Quits: Severian (~severian@c-76-130-56-58.hsd1.co.comcast.net) (Read error: Connection reset by peer)
L151[17:55:43] ⇨ Joins: Severian (~severian@c-76-130-56-58.hsd1.co.comcast.net)
L152[18:05:28] <Eddi|zuHause> !outcome add Whatever your intentions were, they have now been twisted beyond recognition.
L153[18:05:28] <LunchBot> Added outcome: Whatever your intentions were, they have now been twisted beyond recognition.
L154[18:26:37] <sandbox> !mission
L155[18:26:38] <LunchBot> You research quantum flubber sneakers. Reflected engine heat melts a hole in your fuel tank. A third of your fuel gets overexcited and migrates to the nearest planet to mate.
L156[18:27:09] <sandbox> hmm
L157[18:41:47] <XXCoder> you broke reality with flubber
L158[18:42:11] <XXCoder> ;mission
L159[18:42:11] <LunchBot> You must develop a reusable transport and lander for the local Kerbin system. Your spaceplane breaks the runway.
L160[18:42:21] <XXCoder> too heavy then
L161[18:57:15] ⇦ Quits: raptop (~Newpa_Has@ip98-169-23-98.dc.dc.cox.net) (Ping timeout: 207 seconds)
L162[19:07:39] <Eddi|zuHause> i hate when that happens
L163[20:00:34] ⇦ Quits: Pytagoras (~Pytagoras@ti0037a400-1685.bb.online.no) (Read error: Connection reset by peer)
L164[20:01:14] ⇨ Joins: Pytagoras (~Pytagoras@ti0037a400-1685.bb.online.no)
L165[20:04:07] ⇨ Joins: raptop (~Newpa_Has@129.174.182.53)
L166[20:04:07] ChanServ sets mode: +o on raptop
L167[20:08:16] ⇨ Joins: wallacer (~quassel@104.244.72.233)
L168[20:32:55] <Eddi|zuHause> !mission
L169[20:32:55] <LunchBot> Your rocket development is plagued by side-fumbling of the waneshaft. Best practice is to suppress this with hydrocoptic marzlevanes, but you don't know what they are. StackDoubleFlow develops a mod which speeds up the game roughly 10-20,000x. That brings the framerate to a much more reasonable 0.1fps, so the launch timelapse only takes 9 hours to record.
L170[20:33:49] <Eddi|zuHause> but did they implement the marzlevanes
L171[20:38:24] <raptop> They gave up on turboencabulators entirely
L172[20:39:01] <XXCoder> all kinds of stuff I don't know about lol
L173[20:40:52] <raptop> Anyway, you should look up some turboencabulator videos. They're classics like how the missile knows where it is
L174[20:44:34] <Eddi|zuHause> the missile thing reminds me of a really old comedy where they made square wheels, so they can't roll backwards
L175[20:44:45] ⇦ Quits: Althego (~Althego@86FF52B5.unconfigured.pool.telekom.hu) (Quit: HMI Module Alpha Humana on approach to Space Station Mercury)
L176[20:47:29] <raptop> Anyway, the turboencabulator spiel is just a bunch of technobabble, and probably one of the earliest examples of it
L177[20:47:55] <raptop> Oh, hey. It's now 80 years old: https://en.wikipedia.org/wiki/Turbo_encabulator
L178[21:32:48] <Eddi|zuHause> what was hat device called in "dude, where's my car?"
L179[21:32:55] <Eddi|zuHause> *that
L180[21:38:02] <raptop> Haven't seen that movie
L181[21:38:27] <Eddi|zuHause> just looked it up, it was "continuum transfunctioner"
L182[21:48:36] <Eddi|zuHause> sometimes i wish i knew enough about linux internals to teach the swap about keeping certain processes in memory and forcing others to swap more
L183[21:49:46] <Eddi|zuHause> probably a utopian thought, as it's not even trivial to find out which process uses how much memory
L184[22:16:44] ⇦ Quits: sandbox (~sandbox@host-78-147-177-116.as13285.net) (Quit: Leaving)
L185[22:30:42] ⇦ Quits: minas_tirith (~somebuddy@pika.powered.by.lunarbnc.net) (Quit: bye)
L186[22:31:10] ⇨ Joins: minas_tirith (~somebuddy@pika.powered.by.lunarbnc.net)
L187[22:42:47] ⇦ Quits: Severian (~severian@c-76-130-56-58.hsd1.co.comcast.net) (Ping timeout: 183 seconds)
L188[22:42:56] ⇨ Joins: Severian (~severian@c-76-130-56-58.hsd1.co.comcast.net)
L189[22:52:04] ⇦ Quits: raptop (~Newpa_Has@129.174.182.53) (Ping timeout: 190 seconds)
L190[23:22:07] ⇨ Joins: raptop (~Newpa_Has@ip98-169-23-98.dc.dc.cox.net)
L191[23:22:07] ChanServ sets mode: +o on raptop