Let’s see now. Never in the history of white presidents has one been accused of not loving his country, even when he sent it into useless, costly wars at the expense of tens of thousands of soldiers (Vietnam, Iraq) and practically crashed its economy along with the world’s. But somehow, when there’s a black man in the White House, all hell breaks loose and suddenly the black president is accused by white people of not loving his country. And that’s not about race? Really? Seriously? When it comes from right wingers longing to return to a racist past when white men were real men, all was right with America, and blacks knew their place in it?