--- Log opened Tue Jun 28 00:00:40 2016
00:52 Derakon[AFK] is now known as Derakon
01:25 himi [sjjf@Nightstar-dm0.2ni.203.150.IP] has joined #code
01:25 mode/#code [+o himi] by ChanServ
02:31 Reiv [NSwebIRC@Nightstar-sotclq.xtra.co.nz] has quit [Ping timeout: 121 seconds]
03:35 Reiv [NSwebIRC@Nightstar-q8avec.kinect.net.nz] has joined #code
03:36 mode/#code [+o Reiv] by ChanServ
04:18 Turaiel is now known as Turaiel[Offline]
05:09 Derakon is now known as Derakon[AFK]
05:41 crystalclaw|AFK is now known as crystalclaw
06:10 Netsplit Deepthought.Nightstar.Net <-> Krikkit.Nightstar.Net quits: @PinkFreud
06:24 Reiv [NSwebIRC@Nightstar-q8avec.kinect.net.nz] has quit [Ping timeout: 121 seconds]
07:22 Kindamoody[zZz] is now known as Kindamoody
07:53 Shady [ShadyGuru@Nightstar-qfckjl.tv13.ptd.net] has quit [[NS] Quit: Yay, he's gone]
08:01 crystalclaw is now known as crystalclaw|AFK
08:06
<@abudhabi>
JSTL. How do I format a currency number (using fmt:formatNumber here), so that it has decimal places if it's got any fractional stuff, and not show them if it's round?
08:09 Kindamoody is now known as Kindamoody|out
08:18
<~Vornicus>
looks like maybe something like 0.##?
08:18 celticminstrel is now known as celmin|away
08:20
<~Vornicus>
On Excel, which appears to use a similar format, that leaves the decimal point but not the places.
08:20
<~Vornicus>
mm, try just setting max fraction digits?
08:20
<@abudhabi>
Vornicus: Almost. I want 10 -> 10, 10.1 -> 10.10 and 10.11 -> 10.11.
08:20
<~Vornicus>
oh that I have *no* idea how I'd do
08:21 himi [sjjf@Nightstar-dm0.2ni.203.150.IP] has quit [Ping timeout: 121 seconds]
08:21
<~Vornicus>
in excel (which I know my way around), or otherwise
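A minimal sketch of the rule abudhabi is asking for, written in Haskell purely for illustration since the JSTL side was left unresolved here; formatAmount is a made-up helper, not anything from the thread:

    import Text.Printf (printf)

    -- Whole amounts print with no decimals; anything fractional gets two places.
    formatAmount :: Double -> String
    formatAmount x
      | x == fromIntegral (round x :: Integer) = printf "%.0f" x   -- 10    -> "10"
      | otherwise                              = printf "%.2f" x   -- 10.1  -> "10.10"

    main :: IO ()
    main = mapM_ (putStrLn . formatAmount) [10, 10.1, 10.11]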
09:52 Emmy [M@Nightstar-9p7hb1.direct-adsl.nl] has joined #code
10:27
<@abudhabi>
Anyone here ever use https://pypi.python.org/pypi/irc ?
10:44 himi [sjjf@Nightstar-v37cpe.internode.on.net] has joined #code
10:56 You're now known as TheWatcher[d00m]
11:07 Alek [Alek@Nightstar-9qtiqv.il.comcast.net] has quit [Ping timeout: 121 seconds]
11:10 Alek [Alek@Nightstar-9qtiqv.il.comcast.net] has joined #code
11:11 Vornicus [Vorn@ServerAdministrator.Nightstar.Net] has quit [Connection closed]
11:26
<@ErikMesoy>
Does future-proofing and modularization count as optimization for purposes of premature optimization?
11:45
<&[R]>
No
11:45
<&[R]>
The latter usually negatively impacts optimization
11:56 You're now known as TheWatcher
12:26 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
13:12 mode/#code [+o Alek] by ChanServ
13:13 PinkFreud [WhyNot@NetworkAdministrator.Nightstar.Net] has joined #code
13:13 mode/#code [+o PinkFreud] by ChanServ
13:56 celmin|away is now known as celticminstrel
14:08 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
14:26 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
14:27 catadroid` [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
14:27 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
14:29 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
14:29 catadroid` [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
14:35 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
14:38 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
14:50 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
17:28 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
17:32 Vornicus [Vorn@ServerAdministrator.Nightstar.Net] has joined #code
17:32 mode/#code [+qo Vornicus Vornicus] by ChanServ
18:56
<~Vornicus>
shit, low might learn haskell before me. better get cracking
19:01 Shady [ShadyGuru@Nightstar-qfckjl.tv13.ptd.net] has joined #code
19:01 crystalclaw|AFK is now known as crystalclaw
19:09
<@pjdelport>
You never finish learning Haskell.
19:20
<&McMartin>
Yeah, but you hit various tiers
19:25
<@pjdelport>
and or tears
19:26
<&McMartin>
You need more absinthe^Wmonads
19:34
<@pjdelport>
Functors all the way
19:35
<@abudhabi>
What's a good software for webcam recording?
19:35
<&McMartin>
OBS is trying to solve this very general class of problem
19:35
<&McMartin>
It'll let you mix webcam and screencap
19:35
<&McMartin>
But you can just do the one or the other too
19:41
<@abudhabi>
Hm. My webcam seems to be capturing sound but not video.
19:56
<@celticminstrel>
Absinthe-flavoured monads?
20:01
<&McMartin>
celticminstrel: I spent a weekend trying to really get a handle on monads as an abstraction once, which I at the time claimed was the CS equivalent of going on an absinthe bender
20:01
<@celticminstrel>
I... see?
20:02
<&McMartin>
It turns out a significant fraction of my confusion was that the things that I had learned as "monads" in my abstract algebra classes were what Haskell calls monoids.
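For reference, the two things being mixed up, side by side; a tiny runnable contrast, not anything from the conversation itself:

    import Data.Monoid ((<>))

    -- Monoid: a flat, associative way of combining values of one type.
    asMonoid :: [Int]
    asMonoid = [1, 2] <> [3] <> mempty            -- [1,2,3]

    -- Monad: sequencing computations whose results live in a context.
    asMonad :: Maybe Int
    asMonad = Just 2 >>= \x -> Just (x + 1)       -- Just 3

    main :: IO ()
    main = print (asMonoid, asMonad)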
20:03
<@abudhabi>
Hm. OBS has failed to properly get camera video. But ManyCam can get it right. Unfortunately, the free option sucks.
20:06
<@ErikMesoy>
something something endofunctors
20:15 catadroid` [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
20:15 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
20:16 VirusJTG [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has quit [Connection closed]
20:17
<@pjdelport>
Monads are really a bit of a red herring in Haskell.
20:18
<@gnolam>
ErikMesoy: Cadaveric incubator of endofunctors?
20:18
<@pjdelport>
Functors are much more important, fundamental, and mind-blowing IMHO.
20:18
<@pjdelport>
And under-appreciated!
20:19
<&McMartin>
I think I'm less impressed by Functors because they existed in a broader set of guises
20:19
<@pjdelport>
Even now, years later, most of the fuss people make about monads is actually about functors / applicative functors.
20:19 VirusJTG [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has joined #code
20:19 mode/#code [+ao VirusJTG VirusJTG] by ChanServ
20:19
<&McMartin>
Applicative Functors were much more fun but also seem more limited.
20:20 VirusJTG_ [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has joined #code
20:21 VirusJTG_ [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has quit [Connection closed]
20:21 VirusJTG_ [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has joined #code
20:21
<@pjdelport>
Depends a bit on your perspective: they're also more general, which makes them less limited when you want generality.
20:22
<@pjdelport>
An over-specific abstraction can be as limiting as an over-generic one.
20:24 VirusJTG [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has quit [Ping timeout: 121 seconds]
20:26 catadroid` is now known as catadroid
20:28 VirusJTG_ [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has quit [[NS] Quit: Leaving]
20:28 VirusJTG [VirusJTG@Nightstar-6i5vf7.sta.comporium.net] has joined #code
20:28 mode/#code [+ao VirusJTG VirusJTG] by ChanServ
20:28
< catadroid>
My favourite new thing is the abstraction of all container types to reduce
20:42 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Connection closed]
20:42 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has joined #code
20:45
<@pjdelport>
Foldable?
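If the reference is to Foldable, a small sketch of what it buys: the same fold works uniformly over different container types (assuming the containers package for Data.Map):

    import qualified Data.Map as Map   -- containers package

    -- One sum, three different container types.
    totals :: (Int, Int, Int)
    totals = ( sum [1, 2, 3]                                 -- list
             , sum (Just 4)                                  -- Maybe
             , sum (Map.fromList [(1 :: Int, 5), (2, 6)]) )  -- Map (folds over values)

    main :: IO ()
    main = print totals   -- (6, 4, 11)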
20:47
<&McMartin>
12:22 <@pjdelport> An over-specific abstraction can be as limiting as an over-generic one.
20:48
<&McMartin>
This is pretty much my problem with Applicative right now; every example I have is either weirdly ad-hoc seeming or obviously a toy example.
20:48
<@pjdelport>
What do you mean?
20:49
<&McMartin>
I understand the description of the general case, but there seems to be some level of design in taking a data type and selecting that *this* semantics is how it will hew to that general case
20:49
<&McMartin>
The end result is something that looks like you have successfully shoehorned a bunch of disparate things into something a single library function can handle
20:49
<&McMartin>
And that's great, but it's more "a neat trick" than "a new way of thinking".
20:50
<&McMartin>
Though I guess the question is how far one's mind flew after first encountering "that's just a map-reduce problem, isn't it"
20:51
<@pjdelport>
Hmm, I'm not sure I follow what you mean.
20:51
<&McMartin>
Because it seems like the Functor/Applicative Functor expansions are in some sense "that's still map-reduce but you're squinting harder"
20:51
<&McMartin>
Basically, right now the only reason I see for the List Monad being implemented the way it is is "because that makes list comprehensions easy to specify"
20:51
<&McMartin>
And that's no small thing, mind you
20:52
<&McMartin>
But it still feels more arbitrary than fmapping over a graph does.
20:52
<@pjdelport>
For me, at least, Applicative is an amazing sweet spot of obviousness versus power: it just lifts the idea of function application to Functor types.
20:52
<@pjdelport>
So if you grok pure function application, you can grok Applicative function application.
20:53
<&McMartin>
Right, and that works for list because that's "apply a list of functions to a list of argument lists - or is that a list of lists of arguments"
20:53
<@pjdelport>
And most importantly, you can take any pure code written in applicative style, and mechanically transform it into the equivalent Applicative code, just by lifting the function applications.
20:53
<@pjdelport>
That's not necessarily obvious at first, but once you understand it, it's amazingly simple and useful.
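A minimal sketch of that mechanical lifting, on Maybe (plain Prelude, assuming GHC 7.10+ where <$> and <*> are in scope by default):

    plus :: Int -> Int -> Int
    plus x y = x + y

    plain :: Int
    plain = plus 2 3                        -- ordinary application: 5

    lifted :: Maybe Int
    lifted = plus <$> Just 2 <*> Just 3     -- the same shape, lifted: Just 5

    missing :: Maybe Int
    missing = plus <$> Just 2 <*> Nothing   -- structure (failure) is preserved: Nothing

    main :: IO ()
    main = print (plain, lifted, missing)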
20:53
<&McMartin>
My question is "So why does that have to be cartesian product"
20:54
<&McMartin>
And the only answer I have is "if it is, a bunch of stuff works out really well"
20:54
<@pjdelport>
Well, that's the thing.
20:54
<@pjdelport>
It doesn't have to be a Cartesian product: it can also be a pairwise zipping.
20:54
<@pjdelport>
Applicative lets you express both.
20:54
<@pjdelport>
(Monad doesn't, because of its additional constraints.)
20:55
<@pjdelport>
The intuition with Functor and Applicative is that the application must be structure-preserving, for whatever "structure" means for your type.
20:56
<@pjdelport>
With lists, "structure" basically means size and order of elements, and the Cartesian product is basically one of two possible ways to preserve that structure from both sides.
20:57
<@pjdelport>
Zipping is the other.
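Concretely, with the two standard list Applicatives: plain lists (Cartesian product) and ZipList from Control.Applicative (pairwise zipping), the latter being the classic Applicative with no lawful Monad instance:

    import Control.Applicative (ZipList(..))

    cartesian :: [Int]
    cartesian = (+) <$> [1, 2] <*> [10, 20]                            -- [11,21,12,22]

    zipped :: [Int]
    zipped = getZipList ((+) <$> ZipList [1, 2] <*> ZipList [10, 20])  -- [11,22]

    main :: IO ()
    main = print (cartesian, zipped)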
20:57
<&McMartin>
I don't actually consider zipping to qualify, normally, to be fair.
20:57
<&McMartin>
Because zipping typically implies things about the length of the arguments.
21:00
<&McMartin>
Anyway
21:00
<&McMartin>
I think the part where my skepticism is raised is because I grew up next to OO
21:00
<&McMartin>
And I can sense the same sort of emotion that gave us "Hey, wait, objects can be callable, that means we don't have to have functions OMG OMG OMG"
21:01
<&McMartin>
And this is more a sense of "You are buying something by casting things in a certain way but that doesn't mean that there's necessarily a single unifying idea"
21:01
<&McMartin>
In this case, there is, in various modes, and that's "structure-preserving transformation"
21:02
<&McMartin>
Which Monad isn't, but which I don't yet have a punchy grasp of; it seems to be something like "Stepwise-evolving context" but I can't yet cleanly map that to either bind or the do syntax in a way I'm happy with.
21:02
<@pjdelport>
Yup. It turns out that the idea of splitting things into structures/contexts and holes/focuses is an amazingly, amazingly, general and pervasive and useful tool.
21:02
<@pjdelport>
Oh, no, Monad is the same, basically.
21:03
<@pjdelport>
Functor/Applicative/Monad (and Comonad and many other variants) are all united by the single idea of structure preservation.
21:03
<@pjdelport>
They differ only in what things they allow (while being structure-preserving).
21:03
<&McMartin>
Mmm
21:04
<&McMartin>
I'm not sure I agree because you've got, say, the Either monad
21:04
<@pjdelport>
Functor just gives you the ability to apply pure functions to the structure's hole/focus.
21:04
<&McMartin>
Not Either. Maybe.
21:04
<&McMartin>
Which is where you are (maybe) annihilating your structure.
21:04
<@pjdelport>
Applicative extends it by adding the ability to do full function application, in a way that is structure-preserving of both sides of the application.
21:05
<@pjdelport>
Monad extends it by adding the ability to collapse two levels of structure, in a way that preserves the structure of both levels (join)
21:05
<@pjdelport>
And so on.
21:05
<&McMartin>
Right
21:06
<@pjdelport>
Every derived class can basically be described as "... adds the ability to do foo in a functorial-structure-preserving way."
21:06
<&McMartin>
So, I think my -_- here is because "it's an amazingly, amazingly general and pervasive and useful tool" makes me want to reply "yeah, and so is objc_msg_send"
21:06
<@pjdelport>
Either and Maybe are both structure-preserving: Left and Nothing are structure.
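A small illustration of both points at once: the failure constructors are the structure that fmap preserves, and join is the extra ability Monad adds:

    import Control.Monad (join)

    ms :: (Maybe Int, Maybe Int)
    ms = (fmap (+1) (Just 1), fmap (+1) Nothing)           -- (Just 2, Nothing)

    es :: (Either String Int, Either String Int)
    es = (fmap (+1) (Right 1), fmap (+1) (Left "err"))     -- (Right 2, Left "err")

    j :: Maybe Int
    j = join (Just (Just 3))                               -- two layers collapsed: Just 3

    main :: IO ()
    main = print (ms, es, j)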
21:07
<@pjdelport>
McMartin: There's a difference between general and ad hoc.
21:07
<@pjdelport>
objc_msg_send isn't general in any real sense: it doesn't abstract anything. There are no general laws or statements you can give about it.
21:07
<@pjdelport>
It basically tells you and gives you nothing.
21:08
<@pjdelport>
Functor, Applicative, Monad, and the rest give you actual, precise, universal behaviors
21:08
<@pjdelport>
Which you can express as laws, and build entire ecosystems of libraries around.
21:08
<&McMartin>
... I really have no way of phrasing this that doesn't sound patronizing but
21:08
<&McMartin>
You are aware that there's decades of wanking over object-oriented architecture and libraries and stuff, right
21:09
<@pjdelport>
Any new type you can define a Functor/Applicative/Monad for immediately gains access to a huge array of powerful libraries that hook into that general interface.
21:09
<&McMartin>
objc_msg_send is the smallest unit you need to actually build all of that stuff
21:09
<@pjdelport>
And you can't do it without that.
21:09
<@pjdelport>
McMartin: The point is that you still have to build it. You'll still have to go and define the equivalent of Functor / Applicative / Monad / etc.
21:09
<&McMartin>
I suspect the disconnect here is that you're coming at this from the "other direction", as it were
21:10
<@pjdelport>
objc_msg_send itself doesn't give you anything.
21:10
<&McMartin>
Well
21:10
<@pjdelport>
In the same way that pure functions give you anything.
21:10
<&McMartin>
It gives you a specific, controlled use of function pointers
21:10
<&McMartin>
... er, in the same way they do or do not give you anything
21:11
<@pjdelport>
After all, everything in Haskell (including F/A/M and other type classes) is ultimately just syntax sugar for data and functions; but it would be silly to say that function application is all you need.
21:11 * McMartin turns that over, hmms
21:11
<&McMartin>
... I'm not sure that's true, because...
21:11
<@pjdelport>
Function application is the equivalent of objc_msg_send here: you can build anything out of it, and everything is built on top of it.
21:12
<&McMartin>
I don't think F/A/M are syntactic sugar
21:12
<@pjdelport>
But that doesn't make it the abstraction: that's just the primitive that the abstractions are built out of.
21:12
<@pjdelport>
They're just library definitions.
21:12
<&McMartin>
They're an enforceable, generic API
21:12
<&McMartin>
They're "you only *really* have to write this function *once*"
21:13
<@pjdelport>
Yes; that's the whole point of defining a powerful, generic interface.
21:14
<@pjdelport>
But to have that, the interface actually has to be both generic (widely applicable) and powerful (i.e., it comes with strong semantics and meaning that you can build stuff on, without leaving the abstraction)
21:14
<@pjdelport>
That's what F/A/M and friends give you.
21:14
<&McMartin>
Okay
21:14 catadroid [catadroid@Nightstar-2k0p46.dsl.teksavvy.com] has quit [Ping timeout: 121 seconds]
21:15
<&McMartin>
Then we can phrase my initial uneasiness more simply.
21:15
<@pjdelport>
(and unstructured function application or objc_msg_send doesn't, because you don't have any semantics)
21:15
<&McMartin>
"Yes, that's what it's all built on"
21:15
<&McMartin>
"But that's not what you *actually do with it*"
21:16
<&McMartin>
In practice, when I perform list comprehensions, functor applications in a generalized map-reduce context, or ordering of I/O operations, the things I'm thinking about and writing about end up remaining totally independent. The fact that they can be implemented with the same set of core primitives is a mercy for the type checker and a neat quirk of implementation, but it stays hidden
21:16
<&McMartin>
The evidence for this is the number of systems out there that provide fractions of it without doing the unification under the hood
21:17
<&McMartin>
But that could be as basic a statement as "I see the greatness of Functors as not being a great thing about functor *per se* and more the great thing about *typeclasses*"
21:17
<&McMartin>
This is very likely an experience thing
21:18
<&McMartin>
I've done a bit of fiddling with Haskell - mostly solving concrete problems a la Project Euler or "I need an answer to this annoying thing in front of me right here" and modifying other people's applications...
21:18
<@pjdelport>
My experience is the other way around: once I understood Functor and Applicative, it's much harder for me to think in any other terms, because they're so simple and universal to me now.
21:18
<&McMartin>
In short, you're a mathematician and I'm a systems engineer~
21:19
<@pjdelport>
The Functor and Applicative code I write maps pretty much one-to-one to how I think about it.
21:19
<&McMartin>
But yeah, the big thing here is that I haven't worked with systems that actually exploit, say, generic Monad transformers on custom datatypes
21:19
<@pjdelport>
When I'm writing in other contexts, I often frame it in those terms to see the picture more clearly, and then mentally compile it down, so to speak, where necessary.
21:19
<&McMartin>
My experience with things like mapM is "oh, right, this thing that I would organize this way in other languages is spelled mapM in this context"
21:20
<@pjdelport>
I'm not much of a mathematician, though. :)
21:20
<@pjdelport>
See, here I would argue for traverse rather than mapM
21:20
<&McMartin>
Right, but I haven't used traverse
21:20
<&McMartin>
I *have* used mapM
21:21
<&McMartin>
Because what I was doing is spelled foreach elsewhere
21:21
<@pjdelport>
traverse is mapM, just with the correct type :)
21:21
<@pjdelport>
mapM is overly constrained: it doesn't actually need Monad, and it doesn't expose the idea of structure traversal.
21:21
<@pjdelport>
Traversable and traverse do both better.
21:22
<@pjdelport>
(mapM is literally just traverse with its type signature specialised to the wrong type, Monad, because it predates Haskell having the right types)
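The contrast in types (as in post-AMP GHC base), plus traverse used with a non-IO effect; parseAll is a made-up example function:

    import Text.Read (readMaybe)

    --   mapM     :: (Traversable t, Monad m)       => (a -> m b) -> t a -> m (t b)
    --   traverse :: (Traversable t, Applicative f) => (a -> f b) -> t a -> f (t b)

    -- Parse every element, or fail as a whole.
    parseAll :: [String] -> Maybe [Int]
    parseAll = traverse readMaybe

    main :: IO ()
    main = print (parseAll ["1", "2", "3"], parseAll ["1", "oops"])
           -- (Just [1,2,3], Nothing)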
21:23
<&McMartin>
When you say "the Functor and Applicative code that I write", incidentally, do you mean the part where you're implementing fmap for your new types or the part where you're calling fmap on types designed by other people
21:23
<@pjdelport>
Probably more the latter. You tend to use types more than you implement novel ones.
21:24
<@pjdelport>
I rarely use fmap directly; I tend to use the Applicative operations a lot.
21:24
<&McMartin>
<$> and <*>?
21:24
<@pjdelport>
Yeah.
21:24
<&McMartin>
Okay
21:25
<&McMartin>
Then the position I seem to be in is "F/A/M and friends fit into the same conceptual framework as the C++ STL"
21:25
<&McMartin>
They're a thing that's used to provide the tools I need to use to solve my actual problem
21:26
<&McMartin>
Though what you buy in each case turns out to be very different.
21:26
<@pjdelport>
Right, they're just libraries; but what makes them so great is the specific abstraction they implement.
21:26
<&McMartin>
Due to the path I took *to* Haskell, F/A/M seems to buy me "not having to reformat my structures into something the library can eat, because the library functions already know how to eat everything important"
21:27
<&McMartin>
That's also kind of true of the STL but that's because it's got wildly-recursive macro expansions to implement the basic operations you need on the fly.
21:28
<&McMartin>
When working in, say, Python, something like fmap shows up a whole lot, but I usually try to tweak my design such that Python thinks everything is a list
21:28
<&McMartin>
At which point that simply becomes "map"
21:28
<@pjdelport>
Right, but one thing about F/A/M is that they have a vastly more general idea of "structure" here; it's more computational structure in general than data structures.
21:28
<@pjdelport>
That's why things like typed effects are also "structure" to them.
21:29
<@pjdelport>
There are many intangible or non-data kinds of structure that you can use F/A/M to work with.
21:29
<&McMartin>
And that's when my alarms start going off =)
21:30
<@pjdelport>
Should be less alarms, and more angelic choirs. :)
21:30
<&McMartin>
Had enough of those from the Design Patterns guys already, thanks~
21:31
<&McMartin>
"But these are real, unlike those" turns out to be spectacularly nonpersuasive from the outside
21:31
<@pjdelport>
Seriously, though, why alarms?
21:31
<@pjdelport>
It's an amazingly powerful, simplifying, unifying concept.
21:31
<&McMartin>
But I *would* be interested in seeing Haskell systems that exploit the generality of its core computational systems at the problem-solving rather than the library-implementation level
21:32
<@pjdelport>
It's an abstraction that's very general and very rigorous and very reliable: it's exactly all the things we seek most when it comes to software and code reuse.
21:33
<@pjdelport>
The idea that you get non-data kinds of computational structure shouldn't be alarming: you're already familiar with IO actions, right?
21:33
<&McMartin>
Right
21:33
<&McMartin>
That's part of the alarm, more or less
21:33
<@pjdelport>
For IO actions, "structure" just means "effects"
21:33
<@pjdelport>
So everywhere you see "structure-preserving" in the general descriptions, you can just substitute "effect-preserving" when used with or specialised to IO actions.
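For example, fmap over an IO action transforms the result while leaving the effect exactly as it was; nothing is added, dropped, or reordered:

    lineLength :: IO Int
    lineLength = fmap length getLine         -- same single read, different result

    shouted :: IO String
    shouted = (++ "!") <$> getLine

    main :: IO ()
    main = do
      n <- lineLength
      s <- shouted
      print (n, s)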
21:34
<&McMartin>
That's "WE THINK WE CAN SHOEHORN EVERYTHING INTO OUR BEAUTIFUL CONSTRUCT, AND BECAUSE WE CAN, WE HAVE TO" and this is a pattern I've seen happen all over the place for decades and from where I stand this always ends in tears
21:34
<@pjdelport>
That's it.
21:34
<&McMartin>
This is not a universal result!
21:34
<@pjdelport>
Well, it actually does end up being universal in this case.
21:34
<&McMartin>
But you do realize that these are exactly the claims given by the dynamic languages folks and the OO folks etc etc
21:34
<@pjdelport>
Everything you can prove to obey the laws obeys the abstraction, by definition.
21:35
<&McMartin>
And the argument on this is that "this time it's for real" and it has the same - and I don't mean this in a negative sense - mystical overtones that you get from all the other software design disciplines
21:35
<@pjdelport>
All the things that end in tears are the things with no rigor or definition or mathematical laws to back them.
21:35
<@pjdelport>
There's no mystical overtones or opinion here: it's hard math.
21:35
<&McMartin>
That's flatly untrue
21:35
<&McMartin>
Mysticism is the form of religion that relies on internalized experiences
21:36
<&McMartin>
Nobody can tell you about the Matrix, you have to see it for yourself, etc.
21:36
<&McMartin>
AFAICT every major aspect of computer science requires something akin to a mystical experience to actually properly comprehend
21:36
<@pjdelport>
The F/A/M interfaces are all instances of their corresponding mathematical definitions, and they all have mathematical / algebraic laws that they obey.
21:36
<@pjdelport>
And the code that uses them only relies on those laws.
21:36
<@pjdelport>
That's what makes them powerful and reliable abstractions.
21:37
<&McMartin>
Yes, I'm familiar with the guarantees of type theory, and the advantage of abstractions
21:37
<@pjdelport>
"Structure-preserving" isn't something you handwave; it's something you mathematically *prove*.
21:38
<&McMartin>
You are aware that you're strawmanning everyone else there, right
21:39
<@pjdelport>
?
21:39
<&McMartin>
You're conflating two arguments
21:39
<@pjdelport>
This is a well-understood thing; I'm just trying to explain.
21:39
<&McMartin>
Right, and in so doing, you're conflating two assertions
21:39
<&McMartin>
One about the benefits of APIs with rigid contracts, generally, and one about how F/A/M is a really great set of APIs for doing certain things
21:39
<@pjdelport>
I only made one assertion: that the F/A/M abstractions are rigorous and mathematically defined.
21:40
<&McMartin>
You are also describing alternatives as handwaving or things that provide nothing
21:40
<&McMartin>
But rigorous, mathematically defined interfaces go back to the days of toggle switches
21:41
<@pjdelport>
I was just going with what you said, when you talked about other ideas with "mystical overtones".
21:41
<@pjdelport>
The point is that abstractions like F/A/M aren't that.
21:41
<@pjdelport>
They're hard, rigorous, and without gray areas.
21:42
<&McMartin>
Right, and you're misreading what I was going for there
21:42
<&McMartin>
They're hard, rigorous, and without gray areas
21:42
<&McMartin>
But to actually use them effectively you have to personally undergo a dramatic experience that utterly revises your method of thinking about how computation and in some cases thought itself works
21:43
<@pjdelport>
I dunno about that. They slot into my way of thinking very naturally.
21:43
<&McMartin>
Then you were previously primed for it
21:43
<@pjdelport>
The only problem in my experience is the amount of terrible tutorials and explanations out there.
21:43
<&McMartin>
Right
21:44
<&McMartin>
They're trying to write down a mystical experience and the defining feature of those is that you can't do that effectively
21:44
<@pjdelport>
No, I think they're simple, if you just actually explain the right thing: structure/context versus slots/focus
21:44
<&McMartin>
You can only guide someone through their own vision quest
21:44
<@pjdelport>
Most people already have some intuitive grasp of that.
21:44
<&McMartin>
The only reason I can follow your argument there is because I've been experimenting with it myself, so you aren't being as straightforward as you think
21:45
<@pjdelport>
I've taught this to many people in #haskell-beginners, and honestly, most of the time, it's a matter of un-teaching bad info, not teaching anything new.
21:46
<@pjdelport>
Well, I assumed your familiarity by how you talk :)
21:46
<@pjdelport>
So I'm not explaining it as to someone not familiar with it yet.
21:46
<&McMartin>
Right
21:46
<&McMartin>
But the, well, messianic attitude is still coming through
21:46
<&McMartin>
That's what sets off the alarms
21:46
<@pjdelport>
I think you're misreading it.
21:47
<@pjdelport>
Nothing I said is controversial: the fact that e.g. effects are structure is just the way it is.
21:47
<@pjdelport>
That's the whole point of the abstraction.
21:47
<&McMartin>
Sure
21:47
<@pjdelport>
If it sets off warning bells, it just means you're not comfortable with the abstraction or idea yet?
21:47
<&McMartin>
Mmmm, not quite
21:47
<&McMartin>
It means to take their claims of universality with a grain of salt, or perhaps a specific set of challenges
21:47
<@pjdelport>
It's as well-established and foundational as anything can be, in Haskell terms.
21:48
<@pjdelport>
What claims of universality?
21:48
<&McMartin>
Like, for SML, my challenge is "prove that *your* version of functors actually scale to reasonably-sized structures, because man, when I see people use it it looks like modules or really, really inconvenient singletons"
21:48
<~Vornicus>
https://gist.github.com/DUznanski/7f48b1eab0eee7bc69974ac55983fcd8 meanwhile have a smidge of code
21:49
<&McMartin>
So, "obviously", they're doing it wrong, but the examples of excellence in ML books I've read all have the stink of "toy example"
21:49
<&McMartin>
The toy example for monad implementation I have to hand is the one from Learn You A Haskell, and it's actually *fantastic* as one there
21:50
<&McMartin>
But then it gets used by setting up applicative function calls or sequences of do operations
21:50
<&McMartin>
And the end result of that is something I can literally dictate assembler code for
21:50
<@pjdelport>
McMartin: What do you mean about scaling to a size? Functors in general don't have concept of or limitation with regards to structure size.
21:50
<&McMartin>
Which means it's "trivial" from a "doing stuff" standpoint
21:50
<&McMartin>
Design complexity
21:51
<@pjdelport>
(Functors in ML are a different and kinda unrelated thing to Haskell's Functor class, of course)
21:51
<&McMartin>
Like, I have a pretty good grasp of how to have a gigantic system and break it into modules, or to design an object-based architecture for things with objects
21:51
<&McMartin>
(Right; hence the comparison to modules)
21:51
<&McMartin>
My reaction to ML functors is "this is a weird generalization in a space where alternatives have advantages you don't obviously have. What *exactly* do you have in mind here"
21:52
<@pjdelport>
I'm not sure what you mean by "the end result of that is something I can literally dictate assembler code for"
21:52
<&McMartin>
My reaction to Haskell Functors is "yep" and Applicative is "well, OK"
21:52
<@pjdelport>
I do know that a lot of the code in LYAH is terrible, and not good examples of anything.
21:52
<@pjdelport>
A lot of people overuse Monads and do syntax a *lot*
21:52
<&McMartin>
Their example data type is a probabilistic value
21:52
<@pjdelport>
That's a pet peeve of mine: it makes the code hugely more complicated and less general at the same time.
21:53
<@pjdelport>
The equivalent Applicative code is usually lots simpler, and more general and faithful to boot.
21:53
<&McMartin>
Which can be made into a monad; doing x <- [....] basically ends up being "roll a die for this value and do a list comprehension with attached and updated probabilities"
21:53
<&McMartin>
WHich is a sensible thing to do given what Monads represent, AIUI.
21:53
<@pjdelport>
For example, silliness like do x <- foo; y <- bar; return (f x y)
21:53
<@pjdelport>
Instead of just f <$> foo <*> bar
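The two spellings side by side, with IO as the context (any other Applicative works the same way):

    pairMonadic :: IO (String, String)
    pairMonadic = do
      x <- getLine
      y <- getLine
      return (x, y)

    pairApplicative :: IO (String, String)
    pairApplicative = (,) <$> getLine <*> getLine

    main :: IO ()
    main = pairApplicative >>= print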
21:54
<&McMartin>
There's a similarly sensible Applicative set of "apply these functions with these probabilities to these probabilistic values"
21:54
<&McMartin>
But that's the "define a type that can be used as a monad" case
21:55
<@pjdelport>
Right. (Remember that Monad is just Applicative + join, so everything about Monad except for join is actually just Applicative)
21:55
<&McMartin>
Sure, sorry.
21:55
<@pjdelport>
Personally, I really dislike do syntax.
21:55
<&McMartin>
Read above as "so that's a sensible use of join; it's the point where dice are rolled"
21:56
<&McMartin>
Or a point of nondeterminism, perhaps
21:56
<&McMartin>
But anyway
21:56
<@pjdelport>
I very rarely use it, except when writing a very imperative-style main block
21:56
<&McMartin>
I still haven't encountered the thing that works well for you, like, at all
21:56
<@pjdelport>
Other than that, Applicative is almost always simpler, shorter, and clearer.
21:57
<@pjdelport>
And much closer to pure (non-Applicative) Haskell code.
21:57
<&McMartin>
Basically, what I haven't seen is use of the standard (and obviously extensive) library support for Applicative/Monad to do something straightforward
21:57
<&McMartin>
Unless you literally mean <$> and <*> at judicious points
21:57
<@pjdelport>
Oh, all the time.
21:57
<@pjdelport>
Want to read 5 lines? replicateM 5 getLine
21:57
<&McMartin>
... which I *also* haven't seen because everything's either a toy example or HEY LOOK WE CAN COMPENSATE FOR NON-IMPERATIVE CODE
21:58
<@pjdelport>
replicateM should really be replicateA, by the way; again, historical reasons.
21:59
<@pjdelport>
The point is that there are tons of libraries and tooling out there that are entirely generic across any F/A/M type.
22:01
<@pjdelport>
Take http://hackage.haskell.org/package/monad-loops-0.4.3/docs/Control-Monad-Loops.html for example.
22:01
<@pjdelport>
That's an entire reusable library of generic control patterns.
22:02
<@pjdelport>
Without the abstraction of Monad and friends, you can't express those generically: you end up re-writing them for each context from scratch.
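A hand-rolled sketch of the kind of combinator such a library packages up, written once against Monad and reused with any instance; the whileM_ below is a local definition, not the library's, though monad-loops exports one much like it:

    import Data.IORef

    whileM_ :: Monad m => m Bool -> m a -> m ()
    whileM_ cond body = go
      where
        go = do
          ok <- cond
          if ok then body >> go else return ()

    -- Reused at IO: count to three.
    main :: IO ()
    main = do
      counter <- newIORef (0 :: Int)
      whileM_ ((< 3) <$> readIORef counter) $ do
        n <- readIORef counter
        print n
        writeIORef counter (n + 1)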
22:03
<&McMartin>
Still working on articulating the question right
22:04
<&McMartin>
You're being helpful, but the way I want to phrase this leads to more answers like this than what I actually want to ask
22:05
<@pjdelport>
Here's a random example from a toy problem a while back; I know you're not really interested in toy problems, but it's an example of a non-trivial thing built on the Monad abstraction that's genuinely useful for solving an actual problem.
22:05
<@pjdelport>
This is the problem statement: https://github.com/pjdelport/Code-Off/blob/haskell-solutions/code_off-2/code_off-2.md
22:05 Shady [ShadyGuru@Nightstar-qfckjl.tv13.ptd.net] has left #code ["Yay, he's gone"]
22:05
<&McMartin>
My next cut was "What is the thing that would make me feel like I did when I was 12 and first realized the implications of being able to pass comparator functions to a sort routine"
22:05
<@pjdelport>
Here's the Haskell solution: https://github.com/pjdelport/Code-Off/blob/haskell-solutions/code_off-2/CodeOff2.hs#L73
22:06
<&McMartin>
This is less toy than I meant by "toy" above
22:06
<@pjdelport>
The Haskell solution may or may not be a bit opaque, depending on familiarity, but that one single line is basically the whole solution.
22:06
<@pjdelport>
The rest is just defining the terms of the problem.
22:07
<&McMartin>
To make sure that I understand the broad strokes here, the monad that's being deployed here is List, yes?
22:08
<&McMartin>
Everything else is values?
22:08
<@pjdelport>
Right. (Well, so is list :)
22:08
<&McMartin>
Everything else is the actual data being processed, then =P
22:09
<@pjdelport>
Right, but so is list; there's nothing less "actual" about it.
22:10
<&McMartin>
I'm unwilling to debate the philosophical point especially since I'm 2 hours late for lunch.
22:10
<@pjdelport>
The broad stroke is that you want to fill a sequence of jars from a reservoir until the reservoir runs out.
22:10
<@pjdelport>
That's a trivial use of mapAccumL, if you just fill them in one way.
22:11
<@pjdelport>
(mapping while carrying an accumulator value across)
22:11
<@pjdelport>
But, what if you can fill a jar in more than one way?
22:12
<&McMartin>
Okay
22:12
<@pjdelport>
You just make the filling step return a list of fillings instead of a single filling, and swap in mapAccumM to take advantage of that.
22:12
<@pjdelport>
The solution literally just falls out.
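Not the linked solution, just a sketch of the mapAccumL-to-mapAccumM move being described, with made-up fillOne/fillWays helpers and a locally defined mapAccumM:

    import Data.List (mapAccumL)

    -- One way to fill: pour as much as fits.
    fillOne :: Int -> Int -> (Int, Int)
    fillOne reservoir capacity =
      let poured = min reservoir capacity
      in (reservoir - poured, poured)

    deterministic :: (Int, [Int])
    deterministic = mapAccumL fillOne 10 [4, 4, 4]       -- (0, [4,4,2])

    -- mapAccumL's monadic counterpart, written by hand here.
    mapAccumM :: Monad m => (s -> a -> m (s, b)) -> s -> [a] -> m (s, [b])
    mapAccumM _ s []       = return (s, [])
    mapAccumM f s (x : xs) = do
      (s', y)   <- f s x
      (s'', ys) <- mapAccumM f s' xs
      return (s'', y : ys)

    -- Several ways to fill a jar: skip it, or fill it completely if the reservoir allows.
    fillWays :: Int -> Int -> [(Int, Int)]
    fillWays reservoir capacity =
      (reservoir, 0) : [(reservoir - capacity, capacity) | capacity <= reservoir]

    -- The list monad explores every combination of choices.
    nondeterministic :: [(Int, [Int])]
    nondeterministic = mapAccumM fillWays 5 [3, 2]

    main :: IO ()
    main = do
      print deterministic
      mapM_ print nondeterministic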
22:12
<&McMartin>
What I'm getting from this is that this is still the kind of thing where it is conceptually closer to C++'s STL than it is to, say, Lisp's macros.
22:13
<@pjdelport>
I can't even express the detailed backtracking manually without a lot of headache; it's extremely complicated and low-level to me.
22:13
<&McMartin>
It's an extremely efficient way of specifying a broad class of algorithm
22:13
<@pjdelport>
But I can look at the problem statement and see that solution above basically immediately and instantly, from the vantage point that monads give me.
22:14
<@pjdelport>
And it turns out the solution algorithm is already implemented, in a completely generic way, in the form of the mapAccumM library function.
22:14
<&McMartin>
Right
22:14
<@pjdelport>
The problem wasn't written with monads in mind, but the solution was already there, in the generic monad libraries.
22:15
<&McMartin>
As opposed to, say, "Someone has embarked on a large Haskell project. They will begin by defining the core monads that will embody the computation"
22:15
<&McMartin>
The way one would for Lisp macros, since Lisp methodology appears to be "use Lisp macros to define a new language just for this project and write your project in that"
22:16
<&McMartin>
If the big sell here is "the standard library lets me plug in specifics I consider trivial and then everything I care about Just Works" that's a *very easy* pitch to understand.
22:19
<@pjdelport>
Pretty much. Lisp macros mostly actually just correspond to libraries in Haskell, by the way.
22:20
<@pjdelport>
Because Haskell uses non-strict evaluation and explicit effects, there's no need for syntactic macros to control order of evaluation.
22:20
<&McMartin>
As someone who went pretty deep into compilers, I actually care very deeply about the difference between source transformation and linking ;-)
22:20
<@pjdelport>
So the vast majority of Lisp macros are literally just equivalent to libraries in Haskell.
22:21
<&McMartin>
Well. Macro packages
22:21
<&McMartin>
Any given macro is roughly equivalent to a function, I'd say.
22:21
<@pjdelport>
(There's a relative minority of Lisp macros that do more fundamental things than just control flow and evaluation order; those correspond to Template Haskell, basically.)
22:22
<&McMartin>
Those are, I suspect, the things I had in mind
22:22
<&McMartin>
Though you can get pretty close to a DSL from Haskell alone just through careful token selection (hi, Parsec)
22:23
<@pjdelport>
Haskell itself tends to be the DSL :)
22:24
<&McMartin>
I'd say it more embodies the argument that you shouldn't *need* a DSL in the first place
22:26
<@pjdelport>
Well, you never *need* one.
22:26
<@pjdelport>
If you like making things difficult for yourself. :)
22:30 Reiv [NSwebIRC@Nightstar-q8avec.kinect.net.nz] has joined #code
22:30 mode/#code [+o Reiv] by ChanServ
--- Log closed Wed Jun 29 00:00:55 2016