code logs -> 2009 -> Fri, 27 Mar 2009
--- Log opened Fri Mar 27 00:00:02 2009
00:03 AnnoDomini [~farkoff@Nightstar-29015.neoplus.adsl.tpnet.pl] has quit [Quit: "General! Strike me down, then have every last man, woman and child put to the sword. There's no point going on anymore, not now that it's all gone..." "At once, sire."]
00:24 Derakon is now known as Derakon[AFK]
00:42 Vornicus [Vornicus@Admin.Nightstar.Net] has joined #code
00:42 mode/#code [+o Vornicus] by ChanServ
00:58 Reiv [~82d94c4d@Nightstar-29731.dsl.in-addr.zen.co.uk] has joined #Code
01:07 Reiv [~82d94c4d@Nightstar-29731.dsl.in-addr.zen.co.uk] has quit [Quit: classtiem]
01:23 Consul [~consul@Nightstar-634.dsl.sfldmi.ameritech.net] has quit [Client exited]
01:25 Consul [~consul@Nightstar-634.dsl.sfldmi.ameritech.net] has joined #code
01:25 mode/#code [+o Consul] by ChanServ
02:04 Reiv [~82d94c4d@Nightstar-29731.dsl.in-addr.zen.co.uk] has joined #Code
02:06 < Reiv> egad
02:06 < Reiv> Trees. In Haskell.
02:06 < Reiv> We shall... see how this turns out...
02:07 <@McMartin> That's basically what the FiniteMap class is.
02:07 <@McMartin> Albeit with autobalancers and other such fun things
02:10 <@Vornicus> Unless of course you want less obvious trees.
02:18 <@McMartin> If it's the ones in the YAHT, they're just generic tree datastructures to show what a, well, tree-recursive datatype looks like
02:18 <@McMartin> Yeah, home sounds good.
02:18 * McMartin vanishes.
02:23 <@McMartin> Oh, but, before I go, at Reiv, three implementations of Factorial.
02:23 <@McMartin> Because I forgot to post it before.
02:23 <@McMartin> http://paste.ubuntu.com/138667/
02:24 <@McMartin> (Ignore fact3 for now)
02:26 < Reiv> The first is most intuitive
02:27 < Reiv> The second one hurts my brain a little; care to explain it later?
02:27 <@McMartin> Yes
02:27 <@McMartin> But not now, as it's time for me to go home.
02:27 <@McMartin> The second is more efficient for reasons I'll get into then.
02:27 < Reiv> Danke. :)
02:27 < Reiv> Bye!
02:27 <@McMartin> The third is a demonstration of why currying is awesome.
02:28 <@McMartin> (The second is also the Hint for how to do the classical Reverse function.)
02:39 < Reiv> ... uh, Vornicus?
02:40 < Reiv> Why is my internet telling me that the domain name vorn.dyndns.org no longer exists?
02:43 < Reiv> And on an unrelated note - conceptual database models are confusing.
02:43 < Reiv> My instant instinct is to do the physical model, but that's Too Soon apparently.
02:48 Finale [c0cb88fd@Nightstar-14595.mibbit.com] has quit [Quit: http://www.mibbit.com ajax IRC Client]
02:51 <@McMartin> OK, so, do you want me to talk through what f2' and f2 do before I explain why you'd do it that way?
02:53 < Reiv> Yes please.
02:53 < Reiv> Also why you refer to it as f2' would be nice.
02:53 < Reiv> I may have missed a spot of syntax there.
02:54 <@McMartin> That part's easy.
02:54 <@McMartin> The ' is allowed as part of names.
02:54 <@McMartin> It should be read aloud as "prime".
02:54 <@McMartin> If you wish, pretend it is named "fact2_aux"
02:55 < Reiv> aha
02:55 <@McMartin> At any rate. f2' r x. x is the value we're computing the factorial for, and r is the "result so far."
02:55 <@McMartin> If x is 0, the base case, we're done and the result so far is, in fact, the result.
02:55 <@McMartin> That's line 1.
02:56 <@McMartin> Line 2 says that given the result so far and something that isn't the base case, we recurse with an updated result so far (multiplying x in) and a smaller value to compute (x-1).
02:56 <@McMartin> Line 3 says that the actual factorial function is this, but the initial result-so-far is 1.
02:56 <@McMartin> Essentially, you're starting with the base case here instead of ending with it.
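The paste at http://paste.ubuntu.com/138667/ is no longer preserved, so here is a sketch of what the first two implementations plausibly looked like, reconstructed from the line-by-line walkthrough above; the spellings fact1/fact2/fact2' (for f1/f2/f2') are assumptions.

```haskell
-- Plain recursion: the multiply happens *after* the recursive call returns.
fact1 :: Integer -> Integer
fact1 0 = 1
fact1 x = x * fact1 (x - 1)

-- Accumulator version: r is the "result so far", threaded through the calls.
fact2' :: Integer -> Integer -> Integer
fact2' r 0 = r                       -- line 1: the result so far *is* the result
fact2' r x = fact2' (r * x) (x - 1)  -- line 2: tail call, nothing left to do on return

fact2 :: Integer -> Integer
fact2 x = fact2' 1 x                 -- line 3: start with the base case instead of ending with it
```

Both give fact 5 = 120; the difference is entirely in the shape of the call stack, as described below.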
02:57 <@McMartin> As for why you would do this, there are two reasons.
02:57 < Reiv> Which means it doesn't have to calculate all the way to the end before it can start collapsing the recursion?
02:57 <@McMartin> Stronger than that, actually; at the end of the day, it doesn't end up recursing at all.
02:57 <@McMartin> Before that, though, observe the sequence once the base case is hit.
02:58 <@McMartin> In f1, it's "return base value, do a multiply, return, do a multiply, return, do a multiply, return..."
02:58 <@McMartin> In f2, it's "return, return, return, return, return, return..."
02:58 < Reiv> ... ah-hah.
02:58 <@McMartin> This is because the last thing the function does is evaluate a function call. These are called "tail calls", and if it's recursive, the function is said to be "tail recursive".
02:59 <@McMartin> There's an optimization associated with this that can be interpreted in several ways, that basically involves collapsing that chain of returns into a single one.
02:59 <@McMartin> You are either:
02:59 <@McMartin> - returning from the calling function in some sense "before" actually making its final call, thus taking its place on the call stack, or...
02:59 <@McMartin> - ... replacing a function call with what is essentially a GOTO.
03:00 <@McMartin> What this works out to is that when you're doing the computation of, say, 3, with f1 your call stack will look like this:
03:00 <@McMartin> f1 3
03:00 <@McMartin> f1 3 : f1 2
03:00 <@McMartin> f1 3 : f1 2 : f1 1
03:00 <@McMartin> And then you'll start returning
03:00 <@McMartin> While with f2, it will be
03:00 <@McMartin> f2 3
03:00 <@McMartin> f2' 1 3
03:01 <@McMartin> f2' 3 2
03:01 <@McMartin> f2' 6 1
03:01 <@McMartin> f2' 6 0
03:01 <@McMartin> <done>
03:01 < Reiv> Cunning!
03:01 <@McMartin> (A side effect of this is that if you get into an infinite loop, it's far less likely to detect it.)
03:01 <@McMartin> The other thing it does isn't obvious from factorial, but it changes where the parentheses are in the intermediate operations.
03:02 <@McMartin> Multiplication is associative so there's no effect.
03:02 <@McMartin> But suppose we did it with map...
03:02 <@McMartin> (preparing paste)
03:06 <@McMartin> http://paste.ubuntu.com/138685/
03:06 <@McMartin> See if you can see how those will behave differently without running them.
03:06 <@McMartin> If you can't, try running them with, say...
03:06 * Reiv eyes.
03:06 <@McMartin> rmap (*2) [1..10]
03:07 <@McMartin> map, you will recall, takes a function of one argument, and a list, and gives you a new list with that function applied to each element in turn.
03:08 < Reiv> ... the second one can't do anything until the whole list has been traversed. The first one does it piecemeal.
03:08 < Reiv> Yes?
03:09 <@McMartin> There's a rather more dramatic difference, but it's true that if you pass rmap an infinite list you will get an infinite list back, while if you pass imap an infinite list it will hang.
03:09 <@McMartin> (In this, these implementations match foldr (rmap) and foldl (imap))
03:11 <@McMartin> (And those parallels are not coincidental, given the drastic difference in those implementations.)
03:11 <@McMartin> Remember, : is not associative.
03:11 < Reiv> ... aha?
03:11 <@McMartin> Have you tried it?
03:12 <@McMartin> But to back up my claim that it's not...
03:12 <@McMartin> In (a:(b:c)), a and b are both the same type (t), and c is a list of t.
03:13 <@McMartin> In ((a:b):c), if a is some type t, then b is a list of t, and c is a list of lists of t.
03:15 < Reiv> ... ah, yes
03:15 < Reiv> Sorry, brainfart for a moment there. I follow now
03:17 <@McMartin> Now, look what happens when we imap (2*) to [1, 2, 3].
03:17 <@McMartin> imap (2*) [1, 2, 3] -> imap' (2*) [] [1, 2, 3] is the initial setup
03:17 <@McMartin> Then, we pull the first element off, process it, and put it on our result.
03:18 <@McMartin> -> imap' (2*) [2] [2, 3]
03:18 <@McMartin> Then, again.
03:18 <@McMartin> -> imap' (2*) [4, 2] [3]
03:18 <@McMartin> Then, again.
03:18 <@McMartin> -> imap' (2*) [6, 4, 2] []
03:18 <@McMartin> And now we've finished, and the result is [6, 4, 2].
03:18 <@McMartin> Which is wrong.
03:18 <@McMartin> But.
03:18 < Reiv> ... wait, that just went backwards
03:18 < Reiv> yeah
03:18 <@McMartin> Now you should know how to write reverse. ;-)
03:19 * Reiv snickers.
03:19 < Reiv> I'll go home and give it a shot there. Sound good?
03:19 <@McMartin> (:) prepends elements, and since imap is processing one element at a time, it builds it up backwards.
03:19 <@McMartin> Sure
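Paste 138685 is likewise gone, so the following is a plausible reconstruction of the two map variants being compared, with rmap recursing to the right of (:) and imap accumulating the way fact2' does; the argument order for imap' (function, accumulator, input) follows the trace above.

```haskell
-- Recursive map: the (:) happens after the recursive call, like fact1's multiply.
rmap :: (a -> b) -> [a] -> [b]
rmap _ []      = []
rmap f (h : t) = f h : rmap f t

-- Accumulator map: tail-recursive, but (:) prepends each processed element,
-- so the result comes out reversed.
imap' :: (a -> b) -> [b] -> [a] -> [b]
imap' _ r []      = r
imap' f r (h : t) = imap' f (f h : r) t

imap :: (a -> b) -> [a] -> [b]
imap f xs = imap' f [] xs
```

As in the walkthrough above, imap (*2) [1, 2, 3] comes out as [6, 4, 2], while rmap preserves order and, being lazy in its tail, works on infinite lists.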
03:19 * Reiv also really appreciates this stuff. It's helping gel quite a few concepts he was fuzzy on.
03:19 < Reiv> (I am also Really Liking Haskell for learning the conceptual stuff, idly. It seems to mesh with my brain. Go figure.)
03:20 <@McMartin> f3 is a variant of f2 that's really there to demonstrate currying, which isn't going to work out so well without a terp handy.
03:20 <@McMartin> But if you're still stuck at work, I can at least explain the trick.
03:21 < Reiv> What baffles me on f3 is how it goes about getting the desired value to start with
03:22 <@McMartin> Yeah. So, let's look at an easier function.
03:22 <@McMartin> f x y = [x, y]
03:22 <@McMartin> One would think this takes two arguments of some type t and returns a list of them.
03:22 <@McMartin> Say, t t -> [t], in Haskell speak.
03:23 <@McMartin> If you actually ask for the type, you'll get this instead.
03:23 <@McMartin> Prelude> let f x y = [x, y]
03:23 <@McMartin> Prelude> :t f
03:23 <@McMartin> f :: t -> t -> [t]
03:23 <@McMartin> As far as Haskell is concerned, all functions only have one argument.
03:23 <@McMartin> If you think you have more, one of two things is happening
03:23 <@McMartin> Either (a) you have a tuple, and it's (t, t) -> [t]. That's not what's happening here.
03:23 <@McMartin> Or...
03:24 <@McMartin> ... (b) you have a function f which takes a parameter x and returns another function that takes a parameter y and makes a list out of x and y.
03:24 <@McMartin> We can stop this at basically any point, as we can see from this...
03:25 <@McMartin> Prelude> f 2 3
03:25 <@McMartin> [2,3]
03:25 <@McMartin> Prelude> let g = f 1
03:25 <@McMartin> Prelude> g 7
03:25 <@McMartin> [1,7]
03:25 <@McMartin> So, because this is how it's working, and since the base case is the first argument in f2', f3 can just provide that argument and use the partially applied function returned thereby as its own value.
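Assuming fact2' has the accumulator-first shape described earlier (an assumption, since the paste is gone), the f3 trick is plain partial application:

```haskell
fact2' :: Integer -> Integer -> Integer
fact2' r 0 = r
fact2' r x = fact2' (r * x) (x - 1)

-- f3's trick: supply only the first argument (the initial "result so far")
-- and let the returned one-argument function *be* the factorial function.
fact3 :: Integer -> Integer
fact3 = fact2' 1
```

In GHCi, :t fact2' 1 reports Integer -> Integer: handing a two-argument function its first argument yields a function awaiting the rest.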
03:26 <@McMartin> When you combine this with function composition and application operators you can take this to utterly absurd extremes.
03:27 < Reiv> ... Aaaah.
03:27 < Reiv> Cunning!
03:27 <@McMartin> This technique is called "currying" after the early 20th-century mathematician Haskell Curry.
03:28 <@McMartin> In any case.
03:28 <@McMartin> f2 and f3 are generally more efficient than f1 even if, like imap, you have to reverse the final result.
03:29 <@McMartin> This is because each frame on the call stack eats up some space, and since tail calls mean that the stack never grows, you're much closer to what would be a for loop in a language like Java.
03:29 <@McMartin> While the non-tail calls are more like recursion in Java.
03:29 < Reiv> Aha. Okay.
03:30 <@McMartin> (And in fact, the tail calls effectively turn into a "replace the stack frame and do a goto", which means that it produces more or less identical assembler code to something that's reassigning some local variables inside a for loop.)
03:31 <@McMartin> There's also the general issue that having these extra functions is kind of dumping on your namespace.
03:31 <@McMartin> Happily, you can define functions inside other functions.
03:31 < Reiv> I do like that aspect of Haskell.
03:32 <@McMartin> The way I'd actually write imap is like this:
03:32 <@McMartin> imap = aux [] where aux r f [] = r aux r f (h:t) = aux ((f h):r) f t
03:32 <@McMartin> Er
03:32 <@McMartin> Except with working newlines. WTG, emacs.
03:32 * Reiv laughs.
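With its newlines restored, the pasted one-liner works out to the following (same logic, just reflowed; note that this aux takes the accumulator first and the function second, exactly as pasted):

```haskell
-- The helper lives inside imap's where clause instead of polluting the namespace.
imap :: (a -> b) -> [a] -> [b]
imap = aux []
  where
    aux r _ []      = r
    aux r f (h : t) = aux ((f h) : r) f t
```

Since imap = aux [], applying imap to a function and a list passes them straight through as aux's second and third arguments.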
03:32 < Reiv> Okay, I'm off home now.
03:32 < Reiv> Cheers McM
03:32 <@McMartin> There's also a "let" construct that's closer to what LISP and ML have.
03:32 < Reiv> (You're having fun with this, aren't you?)
03:32 <@McMartin> Keeps my hand in~
03:33 <@McMartin> Professionally I work with crawling horrors of C++ codebases
03:33 < Reiv> So it's fun to work with languages not made of lovecraftian madness?
03:34 <@McMartin> Basically~
03:34 < Reiv> Excellent~
03:34 Reiv [~82d94c4d@Nightstar-29731.dsl.in-addr.zen.co.uk] has quit [Quit: hometiem; replies get picked up via backscroll.]
04:29 <@Reiver> So, I was pondering functional programming and game design on my way home.
04:31 <@Reiver> In Object Oriented Programming, if you want to have, say, a Planet and a Spaceship, you'd create objects that had all their features accordingly, and All would be Well. You could also do something similar in a database structure, if it proved more efficient.
04:31 <@Reiver> In Haskell I start to get the impression you could create a custom data type that stored the information, but that seems... messy.
04:42 <@McMartin> Typeclasses are... different.
04:42 <@McMartin> That said, the usual way things are done is that you have a datatype and procedures that manipulate that datatype.
04:43 <@McMartin> OO's primary mechanism involves attaching those procedures in some sense directly to the object.
04:43 <@Reiver> right
04:44 <@McMartin> This is functionally identical to having one extra argument to each such method.
04:45 <@McMartin> Haskell's approach to this is that depending on how you use an argument to a function, it will work out which interfaces said argument must implement.
04:45 <@McMartin> If you try to get the type of \x y -> x + y, for instance, you'll get something like "(Num a) => a -> a -> a"
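A minimal sketch of that datatype-plus-functions style, using hypothetical Planet/Ship types invented for the example above (the field and function names are illustrative, not from any real codebase):

```haskell
-- Hypothetical record types for the Planet/Spaceship example.
data Planet = Planet { pName :: String, gravity :: Double }
data Ship   = Ship   { sName :: String, fuel    :: Double }

-- Instead of a method attached to the object, a plain function that takes
-- the datatype as an (extra) argument.
burn :: Double -> Ship -> Ship
burn amount s = s { fuel = fuel s - amount }

-- Constraint inference: using (+) on x and y is what pins the Num interface.
addEm :: Num a => a -> a -> a
addEm x y = x + y
```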
04:46 somnolence [~somnolenc@203.160.1.ns-3171] has quit [Client exited]
04:47 <@Reiver> Ah... hah.
04:56 <@McMartin> The tutorials will help a little more with that.
04:56 <@McMartin> Haskell does this differently from everyone else.
04:57 <@Reiver> Is this a good thing, a bad thing, or a Dangerous thing?
04:57 <@McMartin> It's a thing
05:02 Syloqs-AFH [Syloq@Admin.Nightstar.Net] has quit [Connection reset by peer]
05:03 somnolence [~somnolenc@203.160.1.ns-3171] has joined #code
05:10 Derakon[AFK] is now known as Derakon
05:20 <@ToxicFrog> Sanity check: the problem of writing an NP algorithm to find a hamiltonian circuit can be re-expressed as the problem of writing a P algorithm to verify whether a given circuit is hamiltonian, and assuming this is used by the nondeterministic machine to come up with a valid circuit.
05:21 <@Derakon> I...think so?
05:21 <@Derakon> It's been a long time since I did P/NP theory, though.
05:21 <@ToxicFrog> (context: "design a nondeterministic algorithm to check if a graph, represented as an adjacency matrix, is hamiltonian in polynomial time.")
05:22 <@Derakon> ...nondeterministic algorithm?
05:23 <@ToxicFrog> Yes.
05:23 <@Derakon> So, what, it invokes rand()?
05:24 <@ToxicFrog> This is described in the text as "an algorithm consisting of a verifier, which is deterministic and answers whether a given solution is valid in polynomial time; and a generator, which feeds possible solutions to the verifier"
05:24 <@ToxicFrog> The idea being that problems in NP can be solved in PTIME by this arrangement, assuming the generator is magical.
05:25 <@ToxicFrog> This is why they're in NP.
05:25 <@Derakon> I'm reminded of the "I can find the maximum value in an unsorted list in constant time, as long as you don't mind me being wrong X% of the time."
05:25 <@ToxicFrog> Heh.
05:25 <@Derakon> Said algorithm consists of randomly indexing into the array 1000 times (or whatever) and taking the biggest result.
05:25 <@ToxicFrog> I figured.
05:26 <@ToxicFrog> Anyways, AIUI, the distinction between P and NP is that problems in P can be solved deterministically in polynomial time; problems in NP can be verified in these conditions, but generating the right answer in the first place requires a nondeterministic machine in the same sense that an NFA is nondeterministic: it "just knows" what the right path to take is.
05:27 <@ToxicFrog> Generating the right answer in PTIME, that is.
05:28 * Derakon nods.
05:28 <@Derakon> That sounds about like I remember.
05:29 <@ToxicFrog> Thus, an NP algorithm consists of the detpol verification algorithm, and the nondetpol generator, the latter of which can be considered a version of rand() that always gives you the right answer.
05:29 * Derakon ponders.
05:29 <@ToxicFrog> So all I actually need to write here is the verification algorithm.
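A sketch of such a verifier (not ToxicFrog's actual answer, which isn't in the log): given an adjacency matrix and a candidate ordering of vertices, check in polynomial time that the ordering visits every vertex exactly once and that every consecutive pair, including the wrap-around edge, is actually connected. It assumes a nonempty candidate list.

```haskell
import Data.List (sort)

-- The verifier half of the NP algorithm: deterministic, polynomial time.
-- The "magical" generator that proposes the circuit is the other half.
isHamiltonian :: [[Bool]] -> [Int] -> Bool
isHamiltonian adj circuit =
     sort circuit == [0 .. n - 1]                               -- each vertex exactly once
  && all edge (zip circuit (tail circuit ++ [head circuit]))    -- the walk is closed
  where
    n = length adj
    edge (u, v) = adj !! u !! v
```

For an n-vertex graph this does O(n log n) work for the permutation check and O(n) edge lookups, comfortably polynomial.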
05:30 <@Derakon> I'd like to add pools to mapgen next. My main issue is coming up with a method for doing so that doesn't amount to "try to add a pool, and restart if it gets too big."
05:30 <@Derakon> And that also isn't "only put pools in downward-traveling terminal nodes".
05:57 Derakon is now known as Derakon[AFK]
05:58 Derakon[AFK] [~Derakon@Nightstar-4879.hsd1.ca.comcast.net] has quit [Quit: This computer has gone to sleep]
06:29 <@ToxicFrog> Oh, how lazy of my prof.
06:29 <@Vornicus> ??
06:29 <@ToxicFrog> "Here is an instance of the knapsack problem. Solve it with branch-and-bound."
06:29 <@ToxicFrog> My B&B is rusty, so I crack open the textbook.
06:29 <@ToxicFrog> The problem he gave is identical to the worked example in the text.
06:29 <@Vornicus> Pfffffff
06:30 <@ToxicFrog> (specifically v=[40, 42, 25, 12], w=[4, 7, 5, 3], W=10)
06:30 <@McMartin> Whoops.
06:30 <@Vornicus> what are v, w, and W?
06:31 <@ToxicFrog> values, weights, total capacity.
06:31 <@Vornicus> as soon as I asked I figured it out, yes.
06:31 <@ToxicFrog> I'm going to solve it anyways, because I need the practice, but how could he tell?
06:32 <@Vornicus> for this do you get more than one of each v/w pair?
06:32 <@ToxicFrog> No.
06:32 <@ToxicFrog> Each item is unique, so you cannot, for example, put in two instances of item 1 and claim a value of 80 (with two weight units left over!)
06:32 * Vornicus makes out at 65.
06:32 <@ToxicFrog> Yep.
06:33 <@ToxicFrog> Using items 1 and 3.
06:34 <@ToxicFrog> (the full problem is in fact "solve using bottom-up dynamic programming; solve using greedy-value, greedy-weight, and greedy-v/w, then prove that none of these can guarantee optimality; solve using branch-and-bound")
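As a check on that 65 (items 1 and 3: values 40 + 25, weight 4 + 5 <= 10), the instance is small enough to enumerate exhaustively; this is a verification sketch, not any of the assigned methods:

```haskell
import Data.List (subsequences)

-- Exhaustive 0/1 knapsack: each item used at most once. Fine for 4 items
-- (2^4 subsets), and a handy oracle for checking DP, greedy, or
-- branch-and-bound answers.
knapsack :: [Int] -> [Int] -> Int -> Int
knapsack vs ws cap =
  maximum [ sum (map fst s)
          | s <- subsequences (zip vs ws)   -- every subset of (value, weight) pairs
          , sum (map snd s) <= cap ]        -- keep only those within capacity
```

knapsack [40, 42, 25, 12] [4, 7, 5, 3] 10 evaluates to 65, matching the answer in the log.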
06:37 Vornicus [Vornicus@Admin.Nightstar.Net] has quit [Quit: ]
06:46 <@ToxicFrog> Done.
06:47 <@ToxicFrog> Let's see, for this assignment, I had to do...Warshall, Floyd-Warshall, backtrack, Dijkstra, bottom-up DP, greedy (x3), branch-and-bound, and Huffman.
06:48 <@ToxicFrog> By hand. >.<
08:06 AnnoDomini [~farkoff@Nightstar-29015.neoplus.adsl.tpnet.pl] has joined #Code
08:06 mode/#code [+o AnnoDomini] by ChanServ
08:48 Rhamphoryncus [~rhamph@Nightstar-7184.ed.shawcable.net] has quit [Quit: Rhamphoryncus]
08:52 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
08:52 mode/#code [+o Attilla] by ChanServ
10:52 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has quit [Ping Timeout]
10:54 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
10:54 mode/#code [+o Attilla] by ChanServ
11:04 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has quit [Connection reset by peer]
11:07 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
11:07 mode/#code [+o Attilla] by ChanServ
11:30 gnolam [lenin@Nightstar-1382.A163.priv.bahnhof.se] has joined #Code
11:30 mode/#code [+o gnolam] by ChanServ
11:43 KarmaBot [AnnoDomini@Nightstar-29015.neoplus.adsl.tpnet.pl] has quit [Connection reset by peer]
11:43 KBot [AnnoDomini@Nightstar-29015.neoplus.adsl.tpnet.pl] has joined #Code
11:46 KBot is now known as KarmaBot
12:58 Tarinaky [~Tarinaky@88.83.110.ns-10776] has quit [Client exited]
13:54 Alek [~omegaboot@Nightstar-3659.dsl.emhril.sbcglobal.net] has quit [Quit: ]
14:24 Alek [~omegaboot@Nightstar-3659.dsl.emhril.sbcglobal.net] has joined #code
15:03 Syloq [Syloq@Admin.Nightstar.Net] has joined #code
15:04 Syloq is now known as Syloqs-AFH
15:15 KBot [AnnoDomini@Nightstar-29064.neoplus.adsl.tpnet.pl] has joined #Code
15:16 KarmaBot [AnnoDomini@Nightstar-29015.neoplus.adsl.tpnet.pl] has quit [Ping Timeout]
15:16 AnnoDomini [~farkoff@Nightstar-29015.neoplus.adsl.tpnet.pl] has quit [Ping Timeout]
15:18 KBot is now known as KarmaBot
15:23 AnnoDomini [~farkoff@Nightstar-29064.neoplus.adsl.tpnet.pl] has joined #Code
15:23 mode/#code [+o AnnoDomini] by ChanServ
15:26 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has quit [Connection reset by peer]
15:26 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
15:26 mode/#code [+o Attilla] by ChanServ
15:38 Vornicus [Vornicus@Admin.Nightstar.Net] has joined #code
15:38 mode/#code [+o Vornicus] by ChanServ
15:40 Vornicus [Vornicus@Admin.Nightstar.Net] has quit [Connection reset by peer]
15:47 Vornicus [Vornicus@Admin.Nightstar.Net] has joined #code
15:47 mode/#code [+o Vornicus] by ChanServ
16:14 Derakon [~Derakon@Nightstar-4879.hsd1.ca.comcast.net] has joined #code
16:15 mode/#code [+o Derakon] by ChanServ
16:57 ToxicFrog [~ToxicFrog@Admin.Nightstar.Net] has quit [Operation timed out]
17:04 Rhamphoryncus [~rhamph@Nightstar-7184.ed.shawcable.net] has joined #code
17:04 <@Derakon> Python's GC handles circular references, right?
17:05 < EvilDarkLord> I think I recall seeing a mention of yes, but I'm not certain.
17:07 * EvilDarkLord happies at Matlab's extensive function list.
17:16 C_tiger [~cheng@Nightstar-5625.hsd1.ca.comcast.net] has quit [Ping Timeout]
17:20 C_tiger [~cheng@Nightstar-5625.hsd1.ca.comcast.net] has joined #code
17:20 mode/#code [+o C_tiger] by ChanServ
17:24 C_tiger [~cheng@Nightstar-5625.hsd1.ca.comcast.net] has quit [Ping Timeout]
17:27 C_tiger [~cheng@Nightstar-5625.hsd1.ca.comcast.net] has joined #code
17:27 mode/#code [+o C_tiger] by ChanServ
18:01 Tarinaky [~Tarinaky@Nightstar-16638.plus.com] has joined #code
18:06 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has quit [Connection reset by peer]
18:07 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
18:07 mode/#code [+o Attilla] by ChanServ
18:07 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has quit [Connection reset by peer]
18:08 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
18:08 mode/#code [+o Attilla] by ChanServ
18:12 <@Derakon> ...what the hell?
18:12 <@Derakon> I just added a bounding box check to my lineLineIntersect function with the goal of speeding it up.
18:13 <@Derakon> And it's faster, all right...but it's making totally different maps!
18:14 <@Derakon> Guess that means there's a bug in my line-line intersect function which is causing it to return true when it ordinarily shouldn't...
18:14 <@Derakon> Which is weird, because I've not noticed any intersections happening.
18:32 * Derakon does some performance testing and listens to his laptop fan spin up.
18:36 <@Derakon> Right, that's about a 1% improvement in mapgen time. ?.?
18:39 <@gnolam> ...
18:39 <@gnolam> Doesn't Python have a profiler? :)
18:39 <@Derakon> It does, yes.
18:40 <@Derakon> Which is how I knew that improving that function could possibly be worthwhile in the first place.
18:40 <@Derakon> I'm reducing calls to pointLineDistance() by doing an extended bounding box check beforehand. pointLineDistance() itself is not particularly complicated, though, so the gain is pretty marginal.
18:41 <@Derakon> I suppose a better way to handle this would be a quadtree or something similar, since the lines don't move once added to the map.
18:49 <@Derakon> And *that* just made things three times worse. >.<
18:53 * gnolam whacks MSVS over the head.
19:13 <@Derakon> Hrm. On second thought, caching which grid cells belong to a given sector is going to be a general pain. :\
19:29 <@C_tiger> how do I grep files in a folder recursively again?
19:30 <@Derakon> grep -r
19:31 <@C_tiger> I'm le-retarded. Here I am trying to pipe ls into grep
19:31 * Derakon snerks.
19:31 <@Derakon> xargs
19:31 <@Derakon> ls | xargs grep -r
19:32 <@Derakon> Hrm...actually, that may not work.
19:32 <@C_tiger> doesn't matter if grep -r works, I'm happy.
19:32 <@Derakon> Fair enough.
19:52 <@McMartin> find is generally what you want for that.
19:53 <@McMartin> The incantation I use most frequently is "find -type f -not -path \*.svn\* -print0 | xargs -0 grep ..."
19:53 <@C_tiger> hmmm... I've never used find. grep was always so easy.
19:53 <@McMartin> find instead of ls, not instead of grep
19:53 <@C_tiger> oh.
19:53 <@C_tiger> smartness.
19:55 <@McMartin> The -print0/-0 thing is so that xargs isn't retarded about files with spaces in their names
19:55 <@McMartin> The "-not -path" stuff is so it doesn't search all my version control metadata.
19:55 <@gnolam> Pfft. And who says Unix isn't user friendly? ;)
19:57 <@McMartin> Well, in the -not -path case, it is; find basically lets you arbitrarily chain query filters.
20:07 <@Doctor_Nick> MONSTERS VS ALIENS!!!!!!!!!!
20:37 Tarinaky [~Tarinaky@Nightstar-16638.plus.com] has quit [Client exited]
20:51 ToxicFrog [~ToxicFrog@Admin.Nightstar.Net] has joined #code
20:51 mode/#code [+o ToxicFrog] by ChanServ
20:57 Vornicus [Vornicus@Admin.Nightstar.Net] has quit [Ping Timeout]
21:03 Vornicus [Vornicus@Admin.Nightstar.Net] has joined #code
21:03 mode/#code [+o Vornicus] by ChanServ
21:23 <@Derakon> Man, the mapgen code has hit 1174 lines.
21:27 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has quit [Ping Timeout]
21:27 Attilla [~The.Attil@Nightstar-9147.cdif.cable.ntl.com] has joined #code
21:27 mode/#code [+o Attilla] by ChanServ
21:38 * Derakon ponders pools.
21:38 <@gnolam> This is my ool. Notice there's no p in it.
21:38 <@Derakon> I don't think I can have pools impact your overall mobility. :\
21:39 <@Derakon> Because I don't have a good way to control which tunnels get pools, given the way fluids have to have a constant waterline in the absence of airlocks.
21:43 <@Derakon> Or alternatively, I only try to generate pools in contained areas, which isn't really as interesting (no water levels; just water rooms) but would keep things more self-contained and manageable.
21:44 <@gnolam> I don't really get the problem, I'm afraid.
21:47 <@Derakon> If water impacts your mobility, then tunnels must know if they contain water before they construct their interiors.
21:47 <@Derakon> And ideally, they'd be constructed differently above and below the waterline.
21:47 <@Derakon> It's a huge can of worms, basically.
21:48 <@Derakon> Hrmph. Think I'll just postpone that issue for now.
21:50 <@gnolam> Ah.
21:50 <@gnolam> Hmm... you were using some sort of tree graph to generate this, right?
21:50 <@Derakon> Yes, but that's nearly useless for purposes of analyzing the actual map structure.
21:51 <@Derakon> Here's a recent map: http://derakon.dyndns.org/~chriswei/games/jbrl/maze/8.png
21:52 <@gnolam> Since there are no cycles, couldn't you just mark certain leaf nodes with a parent node above it as "water nodes"?
21:52 <@Derakon> There are cycles.
21:52 <@Derakon> Take a look at the upper-left.
21:53 <@gnolam> Ah. Missed that one.
21:53 <@gnolam> So just tree/ish/ and not an actual tree then.
21:53 <@gnolam> But come to think of it, leaf nodes should still be fair game.
21:53 <@Derakon> Internally it's a tree; however, some nodes lie on top of other nodes.
22:45 <@Derakon> Okay, time to start moving mobility information from globals into tree nodes, so that they can be different for different regions.
22:45 <@Derakon> I think the main two are vertical and horizontal distance between platforms.
22:46 <@Derakon> Children inherit their parents' mobility information, and may make it more strict, but not less.
22:51 gnolam [lenin@Nightstar-1382.A163.priv.bahnhof.se] has quit [Ping Timeout]
23:06 gnolam [lenin@Nightstar-1382.A163.priv.bahnhof.se] has joined #Code
23:06 mode/#code [+o gnolam] by ChanServ
23:08 <@Derakon> Right, that appears to effectively control platform density...
23:12 <@Derakon> Compare http://derakon.dyndns.org/~chriswei/games/jbrl/mapgen22a.png to http://derakon.dyndns.org/~chriswei/games/jbrl/mapgen22b.png
23:12 <@Derakon> The larger interval's no guarantee that the place will be *inaccessible* to players who can't jump very high, of course.
23:13 <@Derakon> And that's actually my biggest long-term issue: I don't mind players sequence-breaking, but it needs to be impossible to get stuck.
23:20 * Derakon ponders ways to create "buildings" inside of cave rooms.
23:20 <@Derakon> I might be able to do this with an adapted version of the mazemaking algorithm, actually...
23:21 <@Derakon> Hrm, no, don't think so.
23:21 <@Derakon> What I'm looking for is a large, open room with a building inside of it which would have regularly-shaped rooms with passageways between them.
23:25 Doctor_Nick [~nick@Nightstar-17219.tampfl.dsl-w.verizon.net] has quit [Operation timed out]
23:28 Doctor_Nick [~nick@Nightstar-17219.tampfl.dsl-w.verizon.net] has joined #code
23:29 mode/#code [+o Doctor_Nick] by ChanServ
23:33 <@Derakon> Hrm...maybe if I could find some way to convert the negative space surrounding some subset of the tree into open space...
23:33 <@Derakon> That really breaks the tree structure all to hell, though.
23:37 * Vornicus ponders random generation himself.
23:37 <@Derakon> Hmph...not only does it break the tree structure, but I'd really need to do it during the spacefiller algorithm, which is heavily dependent on the tree structure to make sensible crossroads.
23:39 <@Derakon> I'd need something like "this cell type gives way before all other cell types, but passes through walls," and then go back and open up the tunnel endpoints for children of the node that planted that seed.
23:40 <@Derakon> s/seed/cell.
23:40 <@Derakon> But that kind of tiered approach sounds hairy as all get out.
--- Log closed Sat Mar 28 00:00:14 2009