CppCon 2016: Robert Ramey “Safe Numerics Library”



This presentation describes the necessity, utility and usage of a library of safe integer types. These types behave in all respects like built-in integers, but guarantee that no integer expression will return an incorrect result. The library can be reviewed at the Boost Library Incubator.
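As a taste of what the talk covers, here is a minimal sketch of such a type in use, written against the boost::safe_numerics interface under which the library was later accepted into Boost (1.69). At the time of the talk the library still lived in the incubator, so treat the exact header and namespace as assumptions.

    // Minimal sketch, assuming the Boost.SafeNumerics interface
    // (boost/safe_numerics/safe_integer.hpp, Boost >= 1.69).
    #include <boost/safe_numerics/safe_integer.hpp>
    #include <cstdint>
    #include <exception>
    #include <iostream>

    int main() {
        using boost::safe_numerics::safe;
        try {
            safe<std::int8_t> x = 127;
            safe<std::int8_t> y = 2;
            // A plain int8_t assignment would silently truncate here;
            // the safe<> wrapper detects the out-of-range result and throws.
            safe<std::int8_t> z = x + y;
            std::cout << static_cast<int>(z) << '\n';
        } catch (std::exception const& e) {
            std::cout << "error: " << e.what() << '\n';
        }
    }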

Robert Ramey
Robert Ramey Software Development
Proprietor
Santa Barbara, CA


Comments

Thanks for watching this video - any comments you want to make are welcome. If you want to download the slides (with audio), you can Google "Robert Ramey software" to find my website. (YouTube won't let me put a URL here.)

robertramey

9:00 "You're not using safe integers" -- haha, that guy was so mean. :D

drumetul_dacic

This provides a rather well-tuned view of what is "wrong" with numeric variables. Actually, C/C++ is just a wrapper layer on top of the assembly instruction set of a fairly freely chosen target processor (macro assembly would be an intermediate level; let's ignore it here). The truth is that data types and their operations at assembly level can lead to overflows, underflows and much more. Some of these issues are already evident at compile time; others are not and thus will only reveal themselves at runtime. It is the duty of the developer to check the results, and (I am sorry to say) assembly gives the full freedom to take advantage of everything the CPU/ALU/FPU vendor added as features (and educated assembly coders make much use of that), whilst in C/C++ the means are noticeably reduced (and most coders will not even make use of the remaining set of options to check). Having reasonably good code means that this code comes with a reasonable to insanely high number of checks in various places, including the entry, processing and exit phases of functions.
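For a concrete taste of those remaining options, here is a minimal sketch using the GCC/Clang checked-arithmetic builtin __builtin_add_overflow, which surfaces the overflow detection the hardware already performs; compiler support for that builtin is the only assumption.

    #include <cstdint>
    #include <iostream>

    int main() {
        std::int8_t x = 127, y = 2, sum;
        // __builtin_add_overflow (GCC/Clang) computes the mathematically
        // exact result and reports whether it fits the destination type,
        // exposing the check the CPU's flags already make possible.
        if (__builtin_add_overflow(x, y, &sum)) {
            std::cout << "overflow detected\n";
        } else {
            std::cout << static_cast<int>(sum) << '\n';
        }
    }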

Rationale: classical fixed-size variable designs have their very systematic limits. It's a shared story between compilers, the runtime environment (joint software and hardware facilities) and the code as written, to make the thing not only run well in a sample case but keep running well for all possible cases. Needless to say, safety-critical systems, software included, were said to have a 1:4 ratio between the time spent writing code and the time spent testing it. Testing here means partly automated testing at the function, module or system level(s), but also code reviews and the various methods of semi-automatic code checking with third-party tools. And not to forget: selecting a computation language (your programming toolset) that has a noticeable amount of built-in self-checks and error-protection methodology (and then don't forget to apply them!). The more your tooling pipeline gets right on its own, the fewer errors you can end up with. Increasing the warning level of your compiler is only a very basic first step, and the various compilers out in the wild really do not behave the same in this respect, so you will definitely see good benefit in running a multitude of compilers on the very same piece of code.

BTW, the assert() example from the video would have worked much better for error tracking if the constant values had been put into macros and the macros then used in the assert() rather than their derived but typed variables. And admittedly, a modern compiler with the checks enabled will yell at you when numeric constants do not match the variable size, just as it will yell at you (probably as soon as the code optimizer is turned on) if some constant data object creates a range violation in some formula.
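A minimal sketch of that suggestion; the video's actual example is not reproduced here, so the names MAX_WEIGHT and record_weight are purely illustrative.

    #include <cassert>
    #include <cstdint>

    // Named constant as a macro: assert() stringizes its argument, so a
    // failure prints "weight <= MAX_WEIGHT" -- the limit is visible by
    // name instead of hiding behind some derived, typed local variable.
    #define MAX_WEIGHT 32767

    void record_weight(std::int32_t weight) {
        assert(weight <= MAX_WEIGHT);
        // ...
    }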

alexanderstohr

It boggles my mind that something like this isn't standard yet. Correctness will always be more important than performance.

Spiderboydk

Doesn't int8_t x = 127; int8_t y = 2; x + y; actually give undefined behaviour?
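For reference, a minimal sketch of what the usual arithmetic conversions do with that snippet: both operands are promoted to int before the addition, so the expression itself is well-defined; trouble only appears when narrowing the result back to int8_t.

    #include <cstdint>
    #include <iostream>

    int main() {
        std::int8_t x = 127;
        std::int8_t y = 2;
        // Integer promotion: x and y are converted to int first, so the
        // addition yields 129 as an int -- no overflow, no undefined behaviour.
        int sum = x + y;
        std::cout << sum << '\n';               // 129
        // Narrowing back is the risky part: before C++20 the converted
        // value is implementation-defined (typically wraps to -127).
        std::int8_t narrowed = static_cast<std::int8_t>(sum);
        std::cout << static_cast<int>(narrowed) << '\n';
    }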

arkadye