core: ofnode: Have ofnode_read_u32_default return a u32
author Trent Piepho <tpiepho@impinj.com>
Fri, 10 May 2019 17:48:20 +0000 (17:48 +0000)
committer Simon Glass <sjg@chromium.org>
Tue, 21 May 2019 23:33:23 +0000 (17:33 -0600)
commit b061ef39c350c288542536b09dc01d9e984a12ac
tree d8bab333c9261a53eb0669f8d2595d4de3028e4a
parent 347ea0b63eb5143bf0e48aba65a41f50999367f0
core: ofnode: Have ofnode_read_u32_default return a u32

It was returning an int, which doesn't work when the u32 being read,
or the default value, overflows the range of a signed int.
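
A minimal standalone sketch of the failure mode, using a stand-in for
the old helper (the function, typedef, and values here are illustrative,
not the actual U-Boot code):

    #include <stdio.h>

    typedef unsigned int u32;

    /* Stand-in for the old helper, which returned the u32 value as int. */
    static int read_u32_default_old(u32 value_from_dt)
    {
        /* Converting a u32 greater than INT_MAX to int is
         * implementation-defined in C. */
        return value_from_dt;
    }

    int main(void)
    {
        int ret = read_u32_default_old(0x90000000u);

        /* On common two's-complement targets the value comes back
         * negative, so sign-based checks misfire and the original
         * u32 is not reliably representable in the return type. */
        printf("ret = %d (0x%x)\n", ret, (u32)ret);
        return 0;
    }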

While it could be made to work with careful casting, on a C
standard/compiler where converting out-of-range unsigned values to
signed has defined behavior, it seems obvious one is meant to use
ofnode_read_s32_default() with signed values.
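
A sketch of the resulting prototype change in include/dm/ofnode.h; the
parameter list is assumed from the message and may not match the tree
exactly:

    /* Before: the int return type could not represent all u32 values. */
    int ofnode_read_u32_default(ofnode node, const char *propname, u32 def);

    /* After: the full u32 range round-trips through the return value. */
    u32 ofnode_read_u32_default(ofnode node, const char *propname, u32 def);

Callers that store the result in a u32 thus avoid the
implementation-defined narrowing, and signed defaults belong with
ofnode_read_s32_default().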

Cc: Simon Glass <sjg@chromium.org>
Signed-off-by: Trent Piepho <tpiepho@impinj.com>
drivers/core/ofnode.c
include/dm/ofnode.h