Chamok Hasan
Chamok Hasan is a Bangladeshi writer and mathematician. He and Hasibul Ahsan were in the same department at BUET and have known each other for a long time. Chamok Hasan is skilled at teaching mathematics.
Chamok Hasan on calculus
Limit, after all, is a concept that was developed using logic. The definition of limit was rooted in the idea of converging quantities:
If a quantity approaches another quantity, making the difference between them less than any arbitrarily small number in some interval of time, then they become ultimately equal in the end.
This was Newton's idea of converging quantities. But he was never able to base calculus on a rigorous definition of limit; he knew calculus worked but never questioned its validity. Later the mathematician Cauchy defined the limit precisely: the limit is the value a function approaches, though it need not ever reach it.
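The way a function can approach a value without ever reaching it can be shown numerically. A minimal Python sketch, using sin(x)/x near x = 0 as an example of my own choosing (the function is undefined at 0, yet its limit there is 1):

```python
import math

def f(x):
    # sin(x)/x is undefined at x = 0, yet its limit there is 1
    return math.sin(x) / x

# Step closer and closer to 0: the values converge to 1 even though
# f(0) itself does not exist.
for x in [0.1, 0.01, 0.001, 0.0001]:
    print(x, f(x))
```

In Cauchy's language: for every tolerance epsilon > 0 there is a delta > 0 such that 0 < |x| < delta forces |sin(x)/x - 1| < epsilon.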
Mathematics follows logic. It took a long time to establish the connection between mathematics and logic. The most notable endeavour to find such a connection was undertaken by Bertrand Russell and Alfred North Whitehead. Unless we are very mistaken, all of mathematics is logical deduction from a very few logical premises. Pure mathematics has shown that ten logical premises and about twenty principles of deduction are enough to derive the whole of mathematics, from algebra to geometry. This is how it goes.
Pure mathematics contains no constants except logical constants, and consequently no premises, or indemonstrable propositions, but such as are concerned exclusively with logical constants and variables. It is precisely this that distinguishes pure mathematics from applied mathematics. In applied mathematics, results which have been shown by pure mathematics to follow from some hypothesis as to the variable are actually asserted of some constant satisfying the hypothesis in question. Thus terms which were variables become constants, and a new premise is always required, namely: this particular entity satisfies the hypothesis in question. Thus, for example, Euclidean geometry, as a branch of pure mathematics, consists wholly of propositions having the hypothesis "S is a Euclidean space". If we go on to "the space that exists is Euclidean", this enables us to assert of the space that exists the consequents of all the hypotheticals constituting Euclidean geometry, where now the variable S is replaced by the constant "actual space". But by this step we pass from pure to applied mathematics.
The implication p → q used in pure mathematics can be put in a truth table like this:

p | q | p → q
T | T |   T
T | F |   F
F | T |   T
F | F |   T

An implication is false only when its premise is true and its conclusion is false.
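As a small sketch, the truth table for material implication can be generated mechanically; `implies` is a helper name of my own, using the standard equivalence p → q ≡ (not p) or q:

```python
def implies(p, q):
    # Material implication: false only when p is true and q is false
    return (not p) or q

print("p     q     p -> q")
for p in (True, False):
    for q in (True, False):
        print(f"{str(p):<5} {str(q):<5} {implies(p, q)}")
```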
The geometrical interpretation of calculus can be put like this: the derivative of a function at a point x is the slope of the tangent drawn at that point, so the tangent is actually defined through differentiation. The derivative gives the rate of change of a function f(x) at a specific value of the independent variable x.
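The "slope of the tangent" can be approximated directly as the slope of a shrinking secant. A minimal sketch (the function x² and the point x = 3 are illustrative choices of mine):

```python
def derivative(f, x, h=1e-6):
    # Central difference: slope of the secant through x - h and x + h,
    # which tends to the tangent slope as h shrinks
    return (f(x + h) - f(x - h)) / (2 * h)

def square(t):
    return t ** 2          # f(x) = x^2, so f'(x) = 2x

print(derivative(square, 3.0))   # close to 6.0
```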
Integration
Integration is the inverse process of differentiation. It is a process of summing an infinite number of quantities to find the area under a curve. We can define integration in the following way: the definite integral of f(x) from a to b is the limit of a sum of thin rectangle areas,

    ∫_a^b f(x) dx = lim (n → ∞) Σ f(x_i) Δx,  where Δx = (b - a)/n.
If we differentiate the integral, we get the original function back. This is called the fundamental theorem of calculus.
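A numerical sanity check of the theorem, using a midpoint Riemann sum as the integral (the function 3x², whose integral from 0 to x is x³, is my own example):

```python
def integral(f, a, b, n=20000):
    # Midpoint Riemann sum: n thin rectangles under the curve
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def f(x):
    return 3 * x ** 2              # integral of f from 0 to x is x^3

def F(x):
    return integral(f, 0.0, x)     # accumulated area from 0 to x

# Differentiating the integral recovers the original function:
h = 1e-3
rate = (F(2.0 + h) - F(2.0 - h)) / (2 * h)
print(rate)   # close to f(2) = 12, as the fundamental theorem predicts
```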
Some more integration formulas are given:
A chaotic system is very sensitive to initial conditions: two states that start arbitrarily close together diverge rapidly, which makes long-term prediction impossible even though the system is deterministic.
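A minimal sketch of this sensitivity, using the logistic map x → 4x(1 − x) as a standard chaotic example (the map and the starting values are illustrative choices of mine):

```python
def logistic(x, r=4.0):
    # The logistic map x -> r*x*(1 - x); at r = 4 it is chaotic
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9    # two starting points differing by one billionth
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The microscopic initial difference is amplified step by step until
# the two trajectories bear no resemblance to each other.
print(max_gap)
```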
Cantor's Theorem
2^|S| > |S|, where |S| is the cardinality of the set S. If S is a finite set then its cardinality is the number of elements in it, and things are not very interesting. But the concept of cardinality also makes sense for infinite sets. The power set of a set is the set of its subsets. It is easy to see that for a finite set S the cardinality of the power set equals 2^|S|; thus we denote by 2^|S| the cardinality of the power set even for infinite sets S. Cantor's Theorem states that the cardinality of the power set of a set S always exceeds the cardinality of S itself. That is obvious for finite sets but far from trivial for infinite sets.
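Cantor's diagonal argument can be sketched on a small finite S; the particular attempted pairing f below is an arbitrary choice of mine, and the diagonal set defeats any such choice:

```python
S = {0, 1, 2}

# Attempt to pair each element of S with a subset of S (any attempt works)
f = {0: {0, 1}, 1: set(), 2: {0, 2}}

# Cantor's diagonal set: the elements NOT contained in the subset they map to
D = {x for x in S if x not in f[x]}

# D disagrees with f(x) at x itself, for every x, so no element maps to D:
# f is not onto, and since f was arbitrary, 2^|S| > |S|.
print(D, all(D != f[x] for x in S))
```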
"A physical law must possess mathematical beauty" (a maxim attributed to Paul Dirac)
Boltzmann's entropy formula,

    S = k_B ln W

is a key equation of statistical mechanics formulated by Ludwig Boltzmann. It relates the entropy of a macrostate (S) to the number of microstates corresponding to that macrostate (W); k_B is Boltzmann's constant. A microstate describes a system by identifying the properties of each particle; these are microscopic properties such as particle momentum and particle position. A macrostate designates collective properties of a group of particles, such as temperature, volume and pressure. The key point is that multiple different microstates can correspond to the same macrostate. Put more simply, the entropy is related to the number of ways the particles within the system can be arranged (the 'probability of the macrostate'). This equation can then be used to derive thermodynamic equations such as the ideal gas law.
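A toy sketch of counting microstates, assuming a system of N two-state ("up"/"down") particles where the macrostate is the number of up-particles; the setup is my own illustration, S = k_B ln W is Boltzmann's formula:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K

def entropy(W):
    # Boltzmann's formula: S = k_B * ln(W)
    return k_B * math.log(W)

N = 100                   # 100 two-state particles
for up in (0, 25, 50):
    W = math.comb(N, up)  # microstates with exactly `up` particles up
    print(up, W, entropy(W))

# The evenly mixed macrostate (50 up) has by far the most microstates,
# hence the highest entropy; the all-down macrostate (W = 1) has S = 0.
```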
THINKING ABOUT THE UNIVERSE
We live in a strange and wonderful universe. Its age, size, violence, and beauty require extraordinary imagination to appreciate. The place we humans hold within this vast cosmos can seem pretty insignificant. And so we try to make sense of it all and to see how we fit in.

Some decades ago, a well-known scientist (some say it was Bertrand Russell) gave a public lecture on astronomy. He described how the earth orbits around the sun and how the sun, in turn, orbits around the center of a vast collection of stars called our galaxy. At the end of the lecture, a little old lady at the back of the room got up and said: "What you have told us is rubbish. The world is really a flat plate supported on the back of a giant turtle." The scientist gave a superior smile before replying, "What is the turtle standing on?" "You’re very clever, young man, very clever," said the old lady. "But it’s turtles all the way down!"

Most people nowadays would find the picture of our universe as an infinite tower of turtles rather ridiculous. But why should we think we know better? Forget for a minute what you know—or think you know—about space. Then gaze upward at the night sky. What would you make of all those points of light? Are they tiny fires? It can be hard to imagine what they really are, for what they really are is far beyond our ordinary experience.

If you are a regular stargazer, you have probably seen an elusive light hovering near the horizon at twilight. It is a planet, Mercury, but it is nothing like our own planet. A day on Mercury lasts for two-thirds of the planet’s year. Its surface reaches temperatures of over 400 degrees Celsius when the sun is out, then falls to almost -200 degrees Celsius in the dead of night. Yet as different as Mercury is from our own planet, it is not nearly as hard to imagine as a typical star, which is a huge furnace that burns billions of pounds of matter each second and reaches temperatures of tens of millions of degrees at its core.
Another thing that is hard to imagine is how far away the planets and stars really are. The ancient Chinese built stone towers so they could have a closer look at the stars. It’s natural to think the stars and planets are much closer than they really are—after all, in everyday life we have no experience of the huge distances of space. Those distances are so large that it doesn’t even make sense to measure them in feet or miles, the way we measure most lengths. Instead we use the light-year, which is the distance light travels in a year. In one second, a beam of light will travel 186,000 miles, so a light-year is a very long distance. The nearest star, other than our sun, is called Proxima Centauri (also known as Alpha Centauri C), which is about four light-years away. That is so far that even with the fastest spaceship on the drawing boards today, a trip to it would take about ten thousand years. Ancient people tried hard to understand the universe, but they hadn’t yet developed our mathematics and science. Today we have powerful tools: mental tools such as mathematics and the
scientific method, and technological tools like computers and telescopes. With the help of these tools, scientists have pieced together a lot of knowledge about space. But what do we really know about the universe, and how do we know it? Where did the universe come from? Where is it going? Did the universe have a beginning, and if so, what happened before then? What is the nature of time? Will it ever come to an end? Can we go backward in time? Recent breakthroughs in physics, made possible in part by new technology, suggest answers to some of these long-standing questions. Someday these answers may seem as obvious to us as the earth orbiting the sun—or perhaps as ridiculous as a tower of turtles. Only time (whatever that may be) will tell.
Relatively few problems in quantum mechanics have exact solutions, and thus most problems require approximations. Perturbation theory is a useful method of approximation when a problem is very similar to one that has exact solutions. The approximate results differ from the exact ones by a small correction term. Perturbation theory fails when the correction terms are not small.

Consider a set of eigenfunctions and eigenvalues of a given Hamiltonian operator:

    H^(0) ψ_n^(0) = E_n^(0) ψ_n^(0)    (1)

Here the label n identifies a specific solution in the set and the superscript (0) denotes the "order" of the approximation; zeroth order means exact. The set of functions ψ_n^(0) forms an orthonormal basis.

Now, if we consider a first-order correction such that the true Hamiltonian is

    H = H^(0) + H^(1)    (2)

we can expect the corrected wavefunction for a state n to be of the form

    ψ_n = ψ_n^(0) + Σ_{i≠n} c_i ψ_i^(0)    (3)

where the summation runs over all other states i in the basis set and the c_i are real coefficients in the linear expansion. This implies that the corrected wavefunctions are not normalized. It can be shown that, to first order, the mixing coefficients have the values

    c_i^(1) = <ψ_i^(0) | H^(1) | ψ_n^(0)> / (E_n^(0) - E_i^(0)),    i ≠ n
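A numerical sanity check of the first-order formulas on a 2×2 model Hamiltonian (the matrix entries are illustrative numbers of my own; the exact ground energy of a 2×2 symmetric matrix follows from the quadratic formula):

```python
import math

# Unperturbed (diagonal) energies and a small perturbation H1
E0, E1 = 0.0, 1.0
H1 = [[0.02, 0.01],
      [0.01, -0.02]]

# Exact ground-state energy of H = H0 + H1 (2x2 symmetric matrix)
a, b, c = E0 + H1[0][0], E1 + H1[1][1], H1[0][1]
exact_ground = (a + b - math.sqrt((a - b) ** 2 + 4 * c ** 2)) / 2

# First-order perturbation theory: E_n ~ E_n^(0) + <n|H1|n>
pt_ground = E0 + H1[0][0]

# First-order mixing coefficient c_1 = <1|H1|0> / (E0 - E1)
c1 = H1[1][0] / (E0 - E1)

print(exact_ground, pt_ground, c1)
```

Here the first-order estimate 0.02 differs from the exact value by about 1e-4, which is the size of the neglected second-order term |<1|H1|0>|^2 / (E0 - E1), exactly the "small correction" the text describes.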
Laws of thermodynamics
A Briefer History of Time, by Stephen Hawking
A Brief History of Time, by Stephen Hawking
The Grand Design, by Stephen Hawking
Perihelion of Mercury, by Feynman