21 Jan, 2010, JohnnyStarr wrote in the 1st comment:
Votes: 0
Not to derail this thread, but I was wondering if someone could display the Ruby equivalent of Merc's SET_BIT macro, or REMOVE_BIT?
I have read up on bitwise operators, but I can't seem to figure out how to get it to work.
21 Jan, 2010, kiasyn wrote in the 2nd comment:
Votes: 0
I would just use a hash or something

ch.flags = {}
ch.flags[:afk] = true
ch.flags[:likes_cake] = true
ch.flags[:npc] = false
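
Checking and clearing are then just hash operations, e.g. (quick sketch):

puts "away" if ch.flags[:afk]    # unset keys come back nil, which reads as false
ch.flags.delete(:likes_cake)     # clearing a flag just drops the entry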


but if you really wanted bits you can do
BIT_ONE = (1 << 0)
BIT_TWO = (1 << 1)
# …

class BitFlags
  attr_accessor :bits

  def initialize
    @bits = 0
  end

  def set_bit( bit )
    @bits |= bit
  end

  def remove_bit( bit )
    @bits &= ~bit
  end
end
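
Used something like this (quick sketch; testing a bit is just the usual & idiom, since the class above doesn't define a checker):

flags = BitFlags.new
flags.set_bit( BIT_ONE )
flags.set_bit( BIT_TWO )

puts( (flags.bits & BIT_ONE) != 0 )  # => true

flags.remove_bit( BIT_ONE )
puts( (flags.bits & BIT_ONE) != 0 )  # => false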


edit for code tags
21 Jan, 2010, Runter wrote in the 3rd comment:
Votes: 0
Yes, I prefer the higher-level way in Ruby. It really depends on exactly what you want these flags for. Well, the biggest determinant is how many flags you plan on using.

Another option, however, if you really needed to, would be to write a Ruby class in C. This would give you precise control over how the flags are handled.
21 Jan, 2010, quixadhal wrote in the 4th comment:
Votes: 0
The question usually isn't "How do I do bit-flags?", it's "How do I save and look up boolean values?". The Diku folks didn't use bits because they're the best way to do things… they used them because they were trying to run their games on tinker-toy-sized computers where 1000 mobs using an extra 28K of memory would actually matter.

If you're running Ruby, you already aren't trying to run in the tightest memory space possible, so why cling to a mechanism that trades space for complexity? Just use associative arrays and do things by name. Much easier to read, much easier to write, probably easily serializable if you want to save stuff that way.
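
In Ruby that's just a plain hash keyed by name, and the save part can be as simple as dumping it to YAML (a rough sketch; the filename is made up):

require 'yaml'

flags = { 'afk' => true, 'likes_cake' => true }

puts 'away' if flags['afk']

# saving/loading is just serializing the hash
File.open('flags.yml', 'w') { |f| f.write(flags.to_yaml) }
flags = YAML.load(File.read('flags.yml'))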

No offense to Ruby by the way. But it's an interpreter, and the overhead of running the interpreter negates any savings from using bits the way Diku used bits, unless you have hundreds of thousands of things perhaps.
21 Jan, 2010, David Haley wrote in the 5th comment:
Votes: 0
Quote
If you're running Ruby, you already aren't trying to run in the tightest memory space possible, so why cling to a mechanism that trades space for complexity? (…) But it's an interpreter, and the overhead of running the interpreter negates any savings from using bits the way Diku used bits

This is kind of like saying: we're already spending $1,000, who cares if we spend another $100? Or: you're already spending $1,000 more, so saving $100 doesn't matter.
Just because one is running in an interpreter doesn't mean that one should be completely careless when it comes to resource usage. In fact, it's easy to shoot yourself in the foot by not caring and storing stuff dumbly. One day you run out of memory, and oh crap, you don't know why and have to go back and reexamine a bunch of stuff. But if you keep memory in mind as you go, you are likely to have a much better idea of where stuff was spent.

It's not at all unreasonable to assume that storing each flag using symbols like that will cost you 24 bytes or more per flag (memory for the thing being stored, overhead in the table, etc.). With 32 flags you have almost 1k of memory being used; with 10,000 things with flags (10,000 things isn't really that much) you have 7,680,000 bytes being used on flags alone. (It's worth noting that you would probably only store things if they're true and if they're missing consider them false; this would cut down on your usage as well.)
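
In hash terms, that last point looks something like this (just a sketch):

flags = {}                         # only flags that are actually set get stored
flags[:afk] = true

is_afk = flags.fetch(:afk, false)  # missing keys simply read as false
flags.delete(:afk)                 # clearing a flag frees its entry entirely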

Does this mean that you should use actual bits? Probably not. But I disagree with the notion that you should just go do stuff without caring at all about anything related to resource usage.
21 Jan, 2010, Runter wrote in the 6th comment:
Votes: 0
Quote
No offense to Ruby by the way. But it's an interpreter, and the overhead of running the interpreter negates any savings from using bits the way Diku used bits, unless you have hundreds of thousands of things perhaps.


That entirely depends upon the size of the system in question, and it's not something I'm prepared to say is definitively true. There are very contrived examples that would provide anecdotal evidence, but I'm sure people can come up with examples of massive (by comparison) use of flags.
21 Jan, 2010, quixadhal wrote in the 7th comment:
Votes: 0
David Haley said:
It's not at all unreasonable to assume that storing each flag using symbols like that will cost you 24 bytes or more per flag (memory for the thing being stored, overhead in the table, etc.). With 32 flags you have almost 1k of memory being used; with 10,000 things with flags (10,000 things isn't really that much) you have 7,680,000 bytes being used on flags alone. (It's worth noting that you would probably only store things if they're true and if they're missing consider them false; this would cut down on your usage as well.)


A whopping 7M of RAM? Sure, it's worth noting, but… perspective please. Running imcruby1.1, which is about as barebones a network application as you can get, uses up 9M of RAM sitting at the login prompt. How much memory does the server you run your game on have? Mine has 512M of RAM. My firewall, which does nothing but handle DHCP, DNS, timed, and tossing packets around, has 256M.

I think this falls into the penny-wise and pound-foolish category.

Don't get me wrong… being aware of memory use is a good thing. I just don't think the kind of micro-optimization you get from bitfields is worth the pain and suffering involved in using them and maintaining them. If you find yourself running out of memory because of this small a difference, it's time to retire the old Commodore 64 and get an el-cheapo barebones system.
21 Jan, 2010, Tyche wrote in the 8th comment:
Votes: 0
JohnnyStarr said:
Not to derail this thread, but I was wondering if someone could display the Ruby equivalent of Merc's SET_BIT macro, or REMOVE_BIT?
I have read up on bitwise operators, but I can't seem to figure out how to get it to work.


As kiasyn noted, the bit operations work pretty much the same as C's. One nice thing that happens as a side effect of Fixnums being automatically promoted to Bignums is that the size of the bitset is dynamic and can change as you use the operations.

irb(main):003:0> fl |= 1 << 1
=> 2
irb(main):004:0> fl.class
=> Fixnum
irb(main):005:0> fl |= 1 << 200
=> 1606938044258990275541962092341162602522202993782792835301378
irb(main):006:0> fl.class
=> Bignum
21 Jan, 2010, David Haley wrote in the 9th comment:
Votes: 0
Runter said:
but I'm sure people can come up with examples of massive (by comparison) use of flags.

AI applications such as planning, where you reduce the problem to a propositional satisfiability test. Writing this code in C++ is annoying, but you have to be careful when writing it in Ruby/Lua/Python/etc. because when you have tens of thousands of propositions and a search space containing tens of thousands of nodes – each containing every proposition – things get big really fast. And these are "small" planning problems.
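
To put rough numbers on it, here's a toy Ruby sketch (the proposition count is made up) contrasting a per-node hash of every proposition with packing them into one integer:

N_PROPS = 20_000

# one hash entry per proposition, per search node: tens of thousands of entries each
state_hash = {}
(0...N_PROPS).each { |i| state_hash[:"p#{i}"] = false }

# packed version: one (arbitrarily large) integer, one bit per proposition
state_bits = 0
state_bits |= (1 << 42)   # proposition 42 becomes true
puts state_bits[42]       # => 1 (Integer#[] reads a single bit)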

quixadhal said:
A whopping 7M of RAM? Sure, it's worth noting, but…. perspective please.

Perspective please as well – I said it probably isn't worth using literal bits. :wink: That doesn't mean that one shouldn't think about it. I think we all agree on this.
21 Jan, 2010, elanthis wrote in the 10th comment:
Votes: 0
Quote
And these are "small" planning problems.


Quite true. Not that you said they were, but thankfully, those are irrelevant to games. ;)

(Though some game devs try to over-engineer that stuff anyway. I guarantee you've never played any of their games, though, since approaches like that either result in development dragging on for years with no goal in sight before getting cancelled, or the game just has crappy gameplay and AI and ends up going straight to the bargain bin. And then there are the neural-network proponents… we try not to ridicule them, but it's just so easy.)
06 Jul, 2010, Runter wrote in the 11th comment:
Votes: 0
Necro, but a recent post made me think of this one.

Anywho, here's what I've currently got going on.


module HasFlags
  def method_missing(sym, *args)
    type = sym.to_s[-1..-1]   # last character: "?" queries a flag, "!" toggles it

    if type == "?"
      @_flags ||= {}
      return @_flags[sym.to_s[0..-2].intern] ? true : false
    end

    if type == "!"
      @_flags ||= {}
      sym = sym.to_s[0..-2].intern
      if @_flags[sym]
        @_flags.delete(sym)
      else
        @_flags[sym] = true
      end
      return
    end

    raise   # anything else is still an error
  end
end

player = Object.new # Anything
player.extend(HasFlags)

# Query to see if something has flags.
if player.is_a?(HasFlags)
  puts "Yeap."
end

# toggle flags
player.drunk!
player.mute!
player.anything_else_to_set!

puts player.drunk?
puts player.mute?
puts player.some_flag_never_mentioned?

# still fails on other method missings.
begin
  puts player.fail()
rescue
  puts "Failed."
end


http://codepad.org/CFnnJgBN

Yeap.
true
true
false
Failed.


I'm still working on the elegance of method_missing, but I feel a little better about this wrapper approach.

Something like this still works if there are concerns over method_missing conflicts.

class Flags
  include HasFlags
end

ch.flags = Flags.new
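
The toggles and queries then go through that object instead of ch itself, e.g. (sketch):

ch.flags.drunk!
puts ch.flags.drunk?   # => true
puts ch.flags.mute?    # => false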