Fire's null_new_depth

benstoker
Posts: 110
Joined: Thu Jun 10, 2010 7:32 pm
Real Name: Ben Stoker

Fire's null_new_depth

Post by benstoker » Thu Feb 24, 2011 12:23 am

I was looking at Sentinel's Fire code, and null_move.h. It appears as though the null_new_depth function defined in null_move.h is not actually used. The code looks interesting and I was wondering if I am wrong and that the null move routines do indeed use this. Can you enlighten me?

Here's the code:

Code: Select all

static int null_new_depth ( int depth, int delta ) {
    double ddelta, r;
    uint64 Nodes = 0;
    int cpu, rp;
    /* total nodes searched so far, summed over every thread's root positions */
    for ( cpu = 0; cpu < NumThreads; cpu++ )
        for ( rp = 0; rp < RPperCPU; rp++ )
            Nodes += RootPosition[cpu][rp].nodes;
    /* cap the margin so the reduction stays bounded */
    ddelta = MIN ( ( double ) delta, 225.0 );
    /* base reduction of 8, 10 or 12 internal depth units, stepping up once the
       search is both deep enough and has burned enough nodes, plus a smooth
       term that scales with sqrt(margin * depth) */
    if ( depth < 34 || Nodes <= 5000 * 1000 )
        r = 8 + sqrt ( ddelta * depth ) / 60.0;
    else if ( depth < 46 || Nodes <= 200000 * 1000 )
        r = 10 + sqrt ( ddelta * depth ) / 60.0;
    else
        r = 12 + sqrt ( ddelta * depth ) / 60.0;
    return ( depth - ( int ) r );
}
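
To get a feel for the numbers, I pulled the formula out into a standalone driver (my own quick hack, not Fire code; the thread-summed node count becomes a plain parameter):

Code: Select all

#include <math.h>
#include <stdio.h>

#define MIN(a,b) ((a) < (b) ? (a) : (b))

/* standalone copy of the formula above, with the engine globals removed */
static int null_new_depth ( int depth, int delta, unsigned long long nodes ) {
    double ddelta = MIN ( ( double ) delta, 225.0 );
    double r;
    if ( depth < 34 || nodes <= 5000ULL * 1000 )
        r = 8 + sqrt ( ddelta * depth ) / 60.0;
    else if ( depth < 46 || nodes <= 200000ULL * 1000 )
        r = 10 + sqrt ( ddelta * depth ) / 60.0;
    else
        r = 12 + sqrt ( ddelta * depth ) / 60.0;
    return depth - ( int ) r;
}

int main ( void ) {
    int depth;
    for ( depth = 20; depth <= 60; depth += 10 )
        printf ( "depth %2d -> new depth %2d\n",
                 depth, null_new_depth ( depth, 100, 1000000000ULL ) );
    return 0;
}

With delta fixed at 100 and a billion nodes, the reduction works out to between 8 and 13 depth units, growing with depth.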

Sentinel
Posts: 122
Joined: Thu Jun 10, 2010 12:49 am
Real Name: Milos Stanisavljevic

Re: Fire's null_new_depth

Post by Sentinel » Thu Feb 24, 2011 2:18 am

benstoker wrote:I was looking at Sentinel's Fire code, and null_move.h. It appears as though the null_new_depth function defined in null_move.h is not actually used. The code looks interesting and I was wondering if I am wrong and that the null move routines do indeed use this. Can you enlighten me?
It's an improved version of Dann's smooth-scaling idea. It's not used in Fire 1.3; in earlier versions it can be activated through the UCI parameter NMR_SCALING.
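
To show where such a function plugs in, here's a bare-bones null-move fragment (a generic sketch with placeholder names, not Fire's actual search code):

Code: Select all

/* generic null-move pruning shape: Position, search(), make_null() and
   unmake_null() are placeholders, not Fire's identifiers */
extern int search ( Position *pos, int alpha, int beta, int depth );
extern void make_null ( Position *pos );
extern void unmake_null ( Position *pos );

static int null_move_cutoff ( Position *pos, int depth, int eval, int beta ) {
    int value, new_depth;
    if ( eval < beta )      /* only try the null move when standing above beta */
        return 0;
    /* the smooth-scaling reduction replaces a fixed R; delta is the margin */
    new_depth = null_new_depth ( depth, eval - beta );
    make_null ( pos );
    value = -search ( pos, -beta, -beta + 1, new_depth );
    unmake_null ( pos );
    return value >= beta;   /* fail high: the node can be pruned */
}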

benstoker
Posts: 110
Joined: Thu Jun 10, 2010 7:32 pm
Real Name: Ben Stoker

Re: Fire's null_new_depth

Post by benstoker » Tue Mar 01, 2011 1:31 am

Sentinel wrote:
benstoker wrote:I was looking at Sentinel's Fire code, and null_move.h. It appears as though the null_new_depth function defined in null_move.h is not actually used. The code looks interesting and I was wondering if I am wrong and that the null move routines do indeed use this. Can you enlighten me?
It's an improved version of Dann's smooth-scaling idea. It's not used in Fire 1.3; in earlier versions it can be activated through the UCI parameter NMR_SCALING.
Sentinel, while I'm at it, could you explain what your addition, below, to evaluation.h and .c does?

From evaluation.h

Code: Select all

/* bonus indexed by the piece code of the single blocker
   (see Position->sq[LSB(T)] in evaluation.c below) */
static const int WBPinValue[16] =
    {
    0, 0, Score(6, 6), 0,
    0, 0, Score(4, 4), Score(4, 4),
    0, 0, Score(8, 8), 0,
    0, 0, Score(15, 15), Score(15, 15)
    };
static const int BBPinValue[16] =
    {
    0, 0, Score(8, 8), 0,
    0, 0, Score(15, 15), Score(15, 15),
    0, 0, Score(6, 6), 0,
    0, 0, Score(4, 4), Score(4, 4)
    };
static const int WRPinValue[16] =
    {
    0, 0, Score(6, 6), 0,
    Score(4, 4), Score(4, 4), 0, 0,
    0, 0, Score(6, 6), 0,
    Score(4, 4), Score(4, 4), 0, 0
    };
static const int BRPinValue[16] =
    {
    0, 0, Score(6, 6), 0,
    Score(4, 4), Score(4, 4), 0, 0,
    0, 0, Score(6, 6), 0,
    Score(4, 4), Score(4, 4), 0, 0
    };

/* enemy pieces a pin or discovered attack can be aimed at */
#define WBPinTarget (bBitboardQ | bBitboardR)
#define WRPinTarget (bBitboardQ)
#define BBPinTarget (wBitboardQ | wBitboardR)
#define BRPinTarget (wBitboardQ)
And in evaluation.c

Code: Select all

        if(WRPinTarget & Ortho[b])
            {
            T = between[b][LSB(WRPinTarget & Ortho[b])] & (wBitboardOcc | bBitboardOcc);

            if((T & (T - 1)) == 0)
                Value += WRPinValue[Position->sq[LSB(T)]];
            }

RobertP
Posts: 8
Joined: Tue Jun 22, 2010 8:36 am
Real Name: Robert Purves
Location: New Zealand

Re: Fire's null_new_depth

Post by RobertP » Tue Mar 01, 2011 11:40 am

benstoker wrote: ...could you explain what your addition, below, to evaluation.h and .c does?
This is standard bitboard pin detection, adapted to positional scoring of pins and discovered attacks (wR -> bQ). With the proviso that I haven't seen the full source (i.e. I'm going only by the code snippets):
On entry, square b evidently contains a wR.
See comments:

Code: Select all

        if(WRPinTarget & Ortho[b]) // square b is on the same rank or file as a bQ
            {
            T = between[b][LSB(WRPinTarget & Ortho[b])] & (wBitboardOcc | bBitboardOcc); // all blockers of both colors

            if((T & (T - 1)) == 0) // 0 or 1 blocker
                Value += WRPinValue[Position->sq[LSB(T)]]; // credit depending on the blocking piece
            }
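
The (T & (T - 1)) == 0 test is the usual bit trick for "at most one bit set": subtracting 1 clears the lowest set bit, so the AND is zero exactly when T has zero or one bits. A standalone check (plain C, nothing Fire-specific):

Code: Select all

#include <stdint.h>
#include <stdio.h>

/* x & (x - 1) clears the lowest set bit, so the expression is zero
   exactly when x has zero or one bits set */
static int at_most_one_bit ( uint64_t x ) {
    return ( x & ( x - 1 ) ) == 0;
}

int main ( void ) {
    printf ( "%d %d %d\n",
             at_most_one_bit ( 0 ),          /* 1: no blocker   */
             at_most_one_bit ( 1ULL << 36 ), /* 1: one blocker  */
             at_most_one_bit ( 0x28ULL ) );  /* 0: two blockers */
    return 0;
}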
Robert P.

kranium
Posts: 55
Joined: Mon Aug 02, 2010 10:49 pm
Real Name: Norman Schmidt

Re: Fire's null_new_depth

Post by kranium » Fri Apr 22, 2011 12:52 am

benstoker wrote:
Sentinel wrote:
benstoker wrote:I was looking at Sentinel's Fire code, and null_move.h. It appears as though the null_new_depth function defined in null_move.h is not actually used. The code looks interesting and I was wondering if I am wrong and that the null move routines do indeed use this. Can you enlighten me?
It's an improved version of Dann's smooth-scaling idea. It's not used in Fire 1.3; in earlier versions it can be activated through the UCI parameter NMR_SCALING.
Sentinel, while I'm at it, could you explain what your addition, below, to evaluation.h and .c does?

[evaluation.h and evaluation.c code snipped; quoted in full earlier in the thread]

I'm responsible for the addition of the pin code, not Sentinel...
not sure why you assume it was his idea.
In fact, unfortunately, he had very little to do with the last version of Fire... he was very busy at work.

This pin code was originally presented in RobboLito TA, by Thinkingalot...
I'm convinced that it doesn't actually add any Elo benefit, however...
but it makes sense and I couldn't help but add it.

mcostalba
Posts: 91
Joined: Thu Jun 10, 2010 11:45 pm
Real Name: Marco Costalba

Re: Fire's null_new_depth

Post by mcostalba » Fri Apr 22, 2011 7:11 am

kranium wrote: I'm convinced that it doesn't actually add any Elo benefit, however...
but it makes sense and I couldn't help but add it.
IMHO if it doesn't add any Elo, then it makes no sense at all to add it.

Anyhow, your sentence is interesting, because in a few words you perfectly summarize what is IMHO the wrong way to approach engine development, in particular:

1) "I'm convinced that": there is nothing to be convinced of; either tests prove a change works or they prove it doesn't. Full stop, nothing more. Tests are the only metric we apply to evaluate a change.

2) "it makes sense to add": as already explained above, adding stuff with no proven Elo increase is just the fastest way to turn a good source base into a complete mess.

Of course everybody uses the approach he prefers; my comment simply says that we use a completely different approach in SF, and we are happy with that! :-)

kranium
Posts: 55
Joined: Mon Aug 02, 2010 10:49 pm
Real Name: Norman Schmidt

Re: Fire's null_new_depth

Post by kranium » Tue Apr 26, 2011 3:31 pm

mcostalba wrote:
kranium wrote: I'm convinced that it doesn't actually add any Elo benefit, however...
but it makes sense and I couldn't help but add it.
IMHO if it doesn't add any Elo, then it makes no sense at all to add it.

Anyhow, your sentence is interesting, because in a few words you perfectly summarize what is IMHO the wrong way to approach engine development, in particular:

1) "I'm convinced that": there is nothing to be convinced of; either tests prove a change works or they prove it doesn't. Full stop, nothing more. Tests are the only metric we apply to evaluate a change.

2) "it makes sense to add": as already explained above, adding stuff with no proven Elo increase is just the fastest way to turn a good source base into a complete mess.

Of course everybody uses the approach he prefers; my comment simply says that we use a completely different approach in SF, and we are happy with that! :-)
It's not completely black and white, Marco... I believe there is a 'gray' area left for the developer's intuition and instinct.
IMO, sometimes (often) the true strength of an engine is only really known after thousands of long-time-control games.

There is a lot of debate as to when any particular test 'empirically' proves something
(especially chess testing at ultra-fast TC; see the rough numbers below).

I realize that you believe SF is doing well, and that you have had success with your testing methods,
but maybe you should reconsider them... SF seems to be lagging far behind, and the Ippolit source code explains it all quite clearly.

I.e. if your testing techniques are superior, and perhaps empirical... then why is Stockfish not as strong as the rest of the field?
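
As a back-of-the-envelope illustration (standard match-statistics formulas, nothing engine-specific), here is how wide the uncertainty stays even after thousands of games:

Code: Select all

#include <math.h>
#include <stdio.h>

/* score, Elo difference and a ~95% confidence interval for a match
   with made-up win/draw/loss counts */
static double elo_from_score ( double s ) {
    return -400.0 * log10 ( 1.0 / s - 1.0 );
}

int main ( void ) {
    double w = 1200, d = 2700, l = 1100;  /* example W/D/L counts    */
    double n = w + d + l;                 /* 5000 games              */
    double s = ( w + 0.5 * d ) / n;       /* mean score per game     */
    double var = ( w * ( 1.0 - s ) * ( 1.0 - s ) +
                   d * ( 0.5 - s ) * ( 0.5 - s ) +
                   l * s * s ) / n;       /* per-game score variance */
    double se = sqrt ( var / n );         /* standard error of s     */
    printf ( "score %.4f, Elo %+.1f, 95%% CI [%+.1f, %+.1f]\n",
             s, elo_from_score ( s ),
             elo_from_score ( s - 1.96 * se ),
             elo_from_score ( s + 1.96 * se ) );
    return 0;
}

With these made-up counts, 5000 games still only pin the difference down to somewhere between roughly +0.4 and +13.5 Elo.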

mcostalba
Posts: 91
Joined: Thu Jun 10, 2010 11:45 pm
Real Name: Marco Costalba

Re: Fire's null_new_depth

Post by mcostalba » Tue Apr 26, 2011 3:59 pm

kranium wrote: I.e. if your testing techniques are superior, and perhaps empirical... then why is Stockfish not as strong as the rest of the field?
Because we are not able to come up with winning ideas: we test a lot, but most candidate changes turn out to be worth no Elo, or even to weaken the engine.

Regarding the rest of the field, apart from Houdini, we think we are almost there already...

hyatt
Posts: 1242
Joined: Thu Jun 10, 2010 2:13 am
Real Name: Bob Hyatt (Robert M. Hyatt)
Location: University of Alabama at Birmingham

Re: Fire's null_new_depth

Post by hyatt » Tue Apr 26, 2011 7:24 pm

Intuition only takes you so far. Often only as far as "the crash scene" or something similar. :)

(My intuition said that the road would be open in spite of a flash flood warning...)

I, like you, prefer actual testing...

UncombedCoconut
Posts: 44
Joined: Thu Jun 10, 2010 1:43 am
Real Name: Justin Blanchard
Location: United States

Re: Fire's null_new_depth

Post by UncombedCoconut » Wed Apr 27, 2011 9:45 am

kranium wrote: There is a lot of debate as to when any particular test 'empirically' proves something
(especially chess testing at ultra-fast TC).

I realize that you believe SF is doing well, and that you have had success with your testing methods,
but maybe you should reconsider them... SF seems to be lagging far behind, and the Ippolit source code explains it all quite clearly.

I.e. if your testing techniques are superior, and perhaps empirical... then why is Stockfish not as strong as the rest of the field?
Is the alternative to follow your gut through a sequence of changes that are each, with 95% certainty, worth <= 0 Elo, until you somehow wind up at >= +50?
I have a program on which such techniques are likely to work. Perhaps I'll release its next version next April Fools' Day.

There is often room in SF-level programs to make logical, strengthening changes. (As an example, SF could clear hash on "ucinewgame" if the previous game was Fischer-random; see the sketch below.) But even combined, such changes will be worth very little...
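
A sketch of the kind of change I mean (hypothetical identifiers, not Stockfish's actual code):

Code: Select all

/* hash entries cached during a Fischer-random game can be wrong for a
   standard game, so clear the transposition table when the variant
   changes; tt_clear() is a hypothetical TT-clearing routine */
extern void tt_clear ( void );

static int last_game_was_frc = 0;

void handle_ucinewgame ( int frc_now ) {
    if ( last_game_was_frc != frc_now )
        tt_clear ();
    last_game_was_frc = frc_now;
}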
