<AndreYuhai>
Hey there, can I have one queue for two Sidekiq workers? Would they be able to pick up their own jobs?
<jhass>
AndreYuhai: sure, as long as both run the same code. That's essentially how you do larger-scale Sidekiq deployments
<AndreYuhai>
jhass, Why or when would I need to create two workers that run the same code? Why not just run one with more threads?
<AndreYuhai>
jhass, In my case I have workers doing different things. I was wondering whether they can work on the same queue.
<jhass>
they could run on separate machines for example
<AndreYuhai>
jhass, Oh that's right.
<yxhuvud>
I believe any worker for a queue is supposed to be able to handle all jobs on that queue.
<AndreYuhai>
I thought maybe there is a way to differentiate jobs for the workers so they only pick their jobs :D
<jhass>
but sometimes you also get better performance from scaling processes instead of threads (due to things like lock contention)
<jhass>
or you simply want some redundancy
<AndreYuhai>
yxhuvud, That's what I thought
<jhass>
in case one crashes
<yxhuvud>
That is possible. It is called a Queue.
<AndreYuhai>
yxhuvud, yes but is it possible for Sidekiq, like a feature?
<jhass>
yeah, the point of queues, besides prioritization, is so you could have disjoint worker code, though I would not recommend it, it's easy to get wrong
<yxhuvud>
so if you want them separate, you create multiple queues and configure the workers to only pick up jobs from the queues they care about
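A minimal sketch of the queue routing yxhuvud describes, assuming the `sidekiq` gem; the worker and queue names are illustrative, and this is a configuration fragment rather than a standalone script (it needs a Redis backend to actually run):

```ruby
# Each worker declares which queue its jobs land on (names are illustrative).
class ImageWorker
  include Sidekiq::Worker
  sidekiq_options queue: 'images'

  def perform(id)
    # ... process the image with the given id ...
  end
end

class MailWorker
  include Sidekiq::Worker
  sidekiq_options queue: 'mail'

  def perform(id)
    # ... deliver the mail with the given id ...
  end
end
```

Each Sidekiq process is then started with only the queues it should drain, e.g. `sidekiq -q images` on one machine and `sidekiq -q mail` on another.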
<jhass>
there's non Ruby sidekiq worker implementations, so you could have non-Ruby jobs, but that's about the only legit scenario I see for running different worker code
<AndreYuhai>
Yes, now I have two workers each working on its own queue. I was just wondering whether it's possible to make them work on the same queue, or when we would need that.
<jhass>
it's your idea, I don't know :D
<jhass>
(it's not possible, talking about the need part)
<yxhuvud>
jhass: well there are some cases I can think of, mostly to do with sidekiq extensions.
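The shared-queue behavior discussed above (and yxhuvud's "It is called a Queue") can be sketched with Ruby's built-in thread-safe `Queue`, no Sidekiq required: several workers pop from one queue, and each job is delivered to exactly one of them. All names here are illustrative.

```ruby
queue   = Queue.new  # Thread::Queue: thread-safe FIFO, core Ruby
results = Queue.new

# Two workers draining the same queue; whichever pops a job runs it.
workers = 2.times.map do |i|
  Thread.new do
    while (job = queue.pop) != :stop
      results << [i, job * 10]  # pretend "work": multiply by 10
    end
  end
end

5.times { |n| queue << n }   # enqueue five jobs
2.times { queue << :stop }   # one sentinel per worker
workers.each(&:join)

done = Array.new(results.size) { results.pop }
puts done.map(&:last).sort.inspect  # => [0, 10, 20, 30, 40]
```

Every job is processed exactly once, regardless of which worker picks it up, which is why Sidekiq workers sharing a queue never step on each other's jobs.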
<brianj>
Um, how do I check if an element is present in a hash? I tried #include? and #member?. I'm looking at the docs at https://ruby-doc.org/core-2.6.3/Hash.html and I don't see which one to use..
<jhass>
brianj: a key, a value or a specific combination?
<jhass>
has_key?/member?/include? are all aliases fwiw
<brianj>
jhass: well yeah.. but this is Ruby.. shouldn't I have a specific method for that?
<jhass>
why?
<jhass>
I guess you could .to_a.include?([1, 2]) but I would call that out in code review as being terribly inefficient
<brianj>
jhass: well, I need to check for the existence of a specific combination in a big hash.. would you then do if my_hash.has_key?(my_key) && my_hash[my_key] == my_value ?
<jhass>
is `nil` a valid value in this scenario?
<brianj>
as my_key / my_value ?
<jhass>
as my_value
<jhass>
and do you need to disambiguate a key => nil entry from the key being absent from the hash
<brianj>
my_value wont be nil
<jhass>
so I'd just skip the has_key? part then
<jhass>
if it's not present it'll return nil which will never == my_value
<brianj>
ok thanks. I don't understand why include? can't take a key pair..
<jhass>
what you're passing is not a key pair, it's another hash of its own
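The whole exchange boils down to a few one-liners. A quick sketch in plain Ruby (the hash and its contents are illustrative):

```ruby
my_hash = { a: 1, b: 2 }

# Key presence: has_key?, member?, and include? are all aliases.
my_hash.has_key?(:a)             # => true
my_hash.include?(:a)             # => true

# A specific key/value pair: when the value can never be nil,
# a plain lookup-and-compare is enough, because a missing key
# returns nil, which will never == my_value.
my_hash[:a] == 1                 # => true
my_hash[:c] == 3                 # => false (key absent, lookup returns nil)

# The to_a round-trip jhass warns about works, but it builds a
# whole array of pairs just to scan it linearly:
my_hash.to_a.include?([:a, 1])   # => true

# Passing a hash to include? fails because the argument is treated
# as a candidate *key*, not as a key/value pair:
my_hash.include?(a: 1)           # => false
```

This is why the direct `my_hash[my_key] == my_value` lookup is the idiomatic answer here: it is O(1), whereas the `to_a` version is O(n).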