[Bug 1829415] [NEW] printing bitset to_string().c_str() in template produces junk
P Touchman
1829415 at bugs.launchpad.net
Thu May 16 16:20:44 UTC 2019
Public bug reported:
I am trying to print a bitset with printf using a template, but I keep getting junk in the output.
I can print a bitset with
printf("%s",std::bitset<32>(valuetoprint).to_string().to_cstr());
but I thought it would be nice to do that with a template so I could
just go:
printf("%s",mybitset<32>(valuetoprint));
template <const int mysize, class mytype> const char * mybitset(mytype myobj) {
    return std::bitset<mysize>(myobj).to_string().c_str();
}
template <const int mysize> const char * mybitset2(int myobj) {
    return std::bitset<mysize>(myobj).to_string().c_str();
}
template <const int mysize> const char * mybitset3(int myobj) {
    return std::bitset<mysize>(myobj).template to_string<char,std::char_traits<char>,std::allocator<char>>().c_str();
}
template <const int mysize> const char * mybitset4(int myobj) {
    return std::bitset<mysize>(myobj).template to_string<char,std::char_traits<char>,std::allocator<wchar_t>>().c_str();
}
template <const int mysize> std::string mybitset5(int myobj) {
    return std::bitset<mysize>(myobj).template to_string<char,std::char_traits<char>,std::allocator<char>>();
}
printf("testing bitset32 %s " ,std::bitset<32>(0x255+0xf000).to_string().c_str());
printf("testing bitset16 %s " ,std::bitset<16>(0x255+0xf000).to_string().c_str());
printf("testing mybitset32 %s " ,mybitset<32,int>(0x255+0xf000));
printf("testing mybitset8 %s " ,mybitset<8,int>(0x255+0xf000));
printf("testing mybitset2 16 %s " ,mybitset2<16>(0x255+0xf000));
printf("testing mybitset2 15 %s " ,mybitset2<15>(0x255+0xf000));
printf("testing mybitset2 8 %s " ,mybitset2<8>(0x255+0xf000));
printf("testing mybitset2 16 %s " ,mybitset2<16>(0x255+0xf000));
printf("testing mybitset3 15 %s " ,mybitset3<15>(0x255+0xf000));
printf("testing mybitset3 8 %s " ,mybitset3<8>(0x255+0xf000));
printf("testing mybitset3 16 %s \n",mybitset3<16>(0x255+0xf000));
printf("testing mybitset4 16 %s \n",mybitset4<16>(0x255+0xf000));
std::cout << "mybitset5" << mybitset5<16>(0x255+0xf000) << "\n";
and the output I get is:
testing bitset32 00000000000000001111001001010101 testing bitset16 1111001001010101 testing mybitset32 `w�3V testing mybitset8 01010101 testing mybitset2 16 `w�3V testing mybitset2 15 111001001010101 testing mybitset2 8 01010101 testing mybitset2 16 `w�3V testing mybitset3 15 111001001010101 testing mybitset3 8 01010101 testing mybitset3 16 `w�3V
testing mybitset4 16 `w�3V
mybitset51111001001010101
So my template works fine until I try to make it print a bitset of
size 16: a bitset of size 15 prints correctly, but size 16 does not.
I also tried compiling this with clang and it behaved the same way,
printing correct output for a template with bitset size 15 and
garbage for size 16.
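Putting the pieces above into one file, this is roughly the smallest program that shows the problem (only the headers and the main() wrapper are filled in here; mybitset2 is copied verbatim from above):

#include <bitset>
#include <cstdio>

// Copied verbatim from the fragments above.
template <const int mysize> const char * mybitset2(int myobj) {
    return std::bitset<mysize>(myobj).to_string().c_str();
}

int main() {
    printf("testing mybitset2 15 %s \n", mybitset2<15>(0x255+0xf000)); // prints correctly
    printf("testing mybitset2 16 %s \n", mybitset2<16>(0x255+0xf000)); // prints junk
    return 0;
}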
** Affects: gcc-7 (Ubuntu)
Importance: Undecided
Status: New
--
You received this bug notification because you are a member of Ubuntu
Foundations Bugs, which is subscribed to gcc-7 in Ubuntu.
https://bugs.launchpad.net/bugs/1829415