Einstein said, "You must learn to distinguish between what is true and what is real". An apt longer quote of his is: "As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality". That is, it is "true" that the three angles of a triangle add up to 180 degrees in the Euclidean geometry of the plane, but it is not known how to show that this holds in our physical universe (if there is any mass or energy in our universe then it doesn't seem to hold, and it is not actually known what our universe would be like without any mass or energy).
So, science is a relationship between what we can represent and are able to think about, and "what's out there": it's an extension of good map making, most often using various forms of mathematics as the mapping languages. When we guess in science, we are guessing about approximations and mappings to languages; we are not guessing about "the truth" (and we are not in a good state of mind for doing science if we think we are guessing "the truth" or "finding the truth"). This is not at all well understood outside of science, and there are unfortunately a few people with degrees in science who don't seem to understand it either.
Sometimes in math one can guess a theorem that can be proved true. This is a useful process even if one's batting average is less than .500. Guessing in science is done all the time, and the difference between what is real and what is true is not a big factor in the guessing stage, but makes all the difference epistemologically later in the process.
One corner of computing is a kind of mathematics (other corners include design, engineering, etc.). But there are very few interesting actual proofs in computing. A good Don Knuth quote is: "Beware of bugs in the above code; I have only proved it correct, not tried it."
An analogy for why this is so can be found in the n-body problems (and other chaotic behaviors of systems) in physics. An explosion of degrees of freedom (three bodies and gravity is enough) makes a perfectly deterministic model impossible to solve analytically for a future state. However, we can compute any future state by brute-force simulation and see what happens. By analogy, we'd like to prove useful programs correct, but we either have intractable degrees of freedom or, as in the Knuth quote, it is very difficult to know whether we've actually gathered all the cases when we do a "proof".
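To make that concrete, here is a minimal sketch of such a brute-force simulation in Python: three point masses under Newtonian gravity, stepped forward with a simple leapfrog integrator. The masses, starting conditions, unit system (G = 1), and step size are all illustrative choices of mine, not anything from the text; the point is only that the future state is reached by computing every intermediate step, not by solving a formula.

```python
# Toy brute-force 3-body simulation: no closed-form solution exists,
# but the deterministic laws can be stepped forward to see what happens.
# Units are arbitrary (G = 1); masses and starting state are illustrative.

G = 1.0

def accelerations(masses, positions):
    """Newtonian gravitational acceleration on each body from all the others."""
    acc = []
    for i, (xi, yi) in enumerate(positions):
        ax = ay = 0.0
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            dx, dy = xj - xi, yj - yi
            r3 = (dx * dx + dy * dy) ** 1.5
            ax += G * masses[j] * dx / r3
            ay += G * masses[j] * dy / r3
        acc.append((ax, ay))
    return acc

def step(masses, positions, velocities, dt):
    """One leapfrog (kick-drift-kick) step; holds energy better than plain Euler."""
    acc = accelerations(masses, positions)
    velocities = [(vx + 0.5 * dt * ax, vy + 0.5 * dt * ay)
                  for (vx, vy), (ax, ay) in zip(velocities, acc)]
    positions = [(x + dt * vx, y + dt * vy)
                 for (x, y), (vx, vy) in zip(positions, velocities)]
    acc = accelerations(masses, positions)
    velocities = [(vx + 0.5 * dt * ax, vy + 0.5 * dt * ay)
                  for (vx, vy), (ax, ay) in zip(velocities, acc)]
    return positions, velocities

# Three equal masses in an asymmetric configuration; tiny changes here diverge fast.
masses = [1.0, 1.0, 1.0]
positions = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
velocities = [(0.0, 0.0), (0.0, 0.5), (-0.5, 0.0)]

for _ in range(10000):
    positions, velocities = step(masses, positions, velocities, dt=0.001)
print(positions)  # a future state, reached only by computing every step in between
```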
So a guess in computing is often architectural, or a collection of "covering heuristics". An example of the latter is TCP/IP, which has allowed the world's largest and most scalable artifact, the Internet, to be successfully built. An example of the former is the guess I made in 1966 about objects: not that one could build everything from objects (a claim that could be proved mathematically), but that using objects would be a much better way to represent most things. This is not very provable, but, like the Internet, it now has quite a body of evidence suggesting it was a good guess.
Another guess I made long ago (one that does not yet have a body of evidence to support it) is that what is special about the computer is analogous to, and an advance on, what was special about writing and then printing. It's not the automation of past forms that has the big impact; as McLuhan pointed out, when you are able to change the nature of representation and argumentation, those who learn the new ways will wind up being qualitatively different and better thinkers, and this will (usually) help advance our limited conceptions of civilization.
This still seems like a good guess to me—but "truth" has nothing to do with it.