- The Strand bookstore
- McNally Jackson Books
I am not good at adapting to change. The slightest alteration in things that I am used to---my room, my ringtone, the orientation of my study table---bothers me. In April 2018, I graduated from IIT Kharagpur (Kgp) with a Bachelor's and Master's in Maths and Computing. It is the 31st of May today and I am still in Kgp. Most of my friends left for home weeks ago.
How do you cope with change? Say you are dancing the best dance of your life at the best party you have ever been to, and suddenly you have to go and attend a different party. You must. How do you prevail on yourself?
I have been introspecting about the best course of action for the better part of my last academic semester.
Recycling and waste disposal are two chief concerns of the modern world. Governments in rich countries spend a lot of money on them, and yet, across the world, untreated garbage is piling up. What if we could come up with an alternative system of pricing commodities that incentivizes recycling and disposal? For such a system to be sustainable, it has to be profitable in terms of trade and not require much external support. I have one such proposal. Note that this is purely hypothetical, so I haven't dwelt too much on its practicality.
The basic idea is to include the expected cost of disposing of a commodity in its price. The standard way of pricing products takes into account only the effort that went into production. For example, an iPhone is more expensive to make than a low-end Samsung phone, and the two are priced accordingly. If you include the cost of disposal in their prices, both get more expensive (but the Samsung is still cheaper). So, nothing interesting. Yet.
Consider two carry bags: a standard polythene bag and a jute bag. A polythene bag is way cheaper than a jute bag---something that encourages its use. But once you include the cost of disposal, the situation is flipped. Plastic is difficult to dispose of, while jute is biodegradable, so disposing of the polythene bag costs far more than disposing of the jute bag. Thus, in the proposed system, a jute bag effectively becomes cheaper than a polythene bag.
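As a toy illustration of the arithmetic (all figures below are hypothetical and chosen only to show the flip):

```python
# Hypothetical production and disposal costs, in arbitrary currency units.
PRODUCTION_COST = {"polythene bag": 1.0, "jute bag": 4.0}
DISPOSAL_COST = {"polythene bag": 6.0, "jute bag": 0.5}

for item in PRODUCTION_COST:
    effective = PRODUCTION_COST[item] + DISPOSAL_COST[item]
    print(f"{item}: production cost {PRODUCTION_COST[item]}, "
          f"disposal-inclusive price {effective}")

# The polythene bag goes from 1.0 to 7.0 while the jute bag goes from
# 4.0 to 4.5: the relative order of the prices flips.
```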
One can argue similarly that more eco-friendly commodities will be preferred in this system. Not only that, we might even see more research into ways of disposing of difficult-to-dispose stuff. To promote such efforts, we could even allow the trading of disposal credits.
This question is purely philosophical and, as philosophical questions usually go, it is not well-defined. So, let us first try to define it. What does it mean for something to be invented? In the case of physical objects there is not really much confusion. The telephone is considered a human invention because it did not exist in any physical form before its invention and all its components are man-made. On the other hand, America was discovered by Columbus, because it had existed for a long time before Columbus landed there.
In the case of Mathematics, we are in a spot, because Mathematics is abstract: it need not have any physical manifestation to be valid. In fact, the development of axiomatic descriptions of mathematical concepts (like counting and sets) is considered a milestone in the evolution of modern mathematics. One argument in favour of treating it as an invention goes as follows: Mathematics is built upon self-evident truths, or axioms, and simply draws logical consequences of these axioms. Hence, all the mathematical tools that we have---numbers, logic---are purely constructs of the human mind. Further, discovery implies prior existence. In our case, the question is who could have thought of these concepts before humans did. (This question is not precise, because we do not have a concrete definition of thinking beyond the human brain.)
To consider Mathematics a discovery calls for first proving prior existence. Note that it is not important to know who the original discoverer was (if there was one). Hence, we can skip getting into religious terrain.
Now, despite the possibility of having totally arbitrary axioms, they do have to be logically consistent, and most of mathematics actually makes use of axioms that make sense. For example, Peano's axioms, which are the basis of our usual number system, are very intuitive. What is intuitive and what is not is determined to a large part by our experiences of our surroundings (call it nature). For mathematical concepts that are influenced by natural experience in this way, it is perhaps reasonable to call them discoveries.
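For reference, here is one standard modern statement of Peano's axioms, with $S$ the successor function and induction stated as a schema over properties $P$ (the exact formulation varies between textbooks):

$$
\begin{aligned}
&0 \in \mathbb{N};\\
&n \in \mathbb{N} \implies S(n) \in \mathbb{N};\\
&S(m) = S(n) \implies m = n;\\
&S(n) \neq 0 \quad \text{for every } n;\\
&\bigl(P(0) \wedge \forall n\,(P(n) \implies P(S(n)))\bigr) \implies \forall n\,P(n).
\end{aligned}
$$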
So, recently I gave up on my Jekyll blog. Jekyll is the most popular static website generator, but for some reason I could never really get the hang of it. After hours of struggling to do some customizations, I figured it would be quicker to start from scratch.
I have shifted to Nikola, another static site generator. Its biggest selling point is that it is written in Python and requires minimal digging through its documentation.
I hope the journey will be less painful this time.
This was the last week of my Google Summer of Code project on fast series expansions for SymPy and SymEngine. It has been a thoroughly amazing experience, challenging and rewarding in more ways than I had imagined. I was extremely lucky to have such awesome mentors as Ondrej and Thilina.
Though I couldn't achieve all that I had planned in my proposal, the project taught me what I think is my biggest takeaway from the experience: things seldom work the way you want them to. In fact, I faced the most difficulty in the part of my project that I had assumed to be trivial: ring series in SymPy. And as it turned out, it was the cornerstone of what I had set out to do, and it needed to be done well.
Fortunately, things turned out rather well, and most of the difficult questions with regard to `ring_series` have now been answered. `ring_series` now has a general purpose `rs_series` function that expands an arbitrary SymPy expression really fast. Most of the important internal functions are also implemented. I think that, as a module, `ring_series` has reached a stage where it can be helpful to people, and others can help with improving and expanding it. Of course, a crazy amount of work still needs to be done, and for that we need a lot of helping hands.
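A minimal usage sketch (the expression and precision here are just examples I picked):

```python
from sympy import exp, sin, symbols
from sympy.polys.ring_series import rs_series

x = symbols('x')

# Expand a composite SymPy expression as a series in x up to O(x**8);
# the expansion itself runs on a sparse polynomial ring internally.
print(rs_series(sin(x)*exp(x), x, 8))
```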
I have been writing a guide as well as documenting the internals in PR 9839. The importance of good documentation is another lesson I learnt during my project.
The most important thing is that people use these new capabilities. I hope more people will get involved. If all goes well, it is a potential replacement for the `series` method of SymPy.
Other than that, I had a very fruitful discussion with Ondrej about how to implement polynomials and then series expansion in SymEngine. You can read the summary here. I am already excited about writing all this new stuff.
The end of GSoC is not really an end; it is a beginning of more interesting times :)
Last week I told you why `rs_series` doesn't work with negative or fractional powers, because of the constraints of a polynomial back-end, and why we would need to modify `polys`. The situation isn't actually that hopeless. Let's talk about negative and fractional powers one by one.
Negative exponents work in `ring_series` because I modified `PolyElement` to allow them. In hindsight, it wasn't the right decision, and it needs to be replaced with something that doesn't alter `polys`. It is rather surprising that I came across a possible solution so late (now we know why good documentation is so important).
I already knew that `polys` allows us to create a `FractionField`. A fraction field over a domain `R` consists of elements of the form `p/q`, where `p` and `q` belong to `R` (with `q` nonzero). In our case we are interested in the fraction field of a polynomial ring, i.e., fractions with polynomials as numerator and denominator. So a negative power is not stored as `a * b**(-1)` but as `a / b`, where `a` and `b` are polynomials.
What was new to me was that, just as it has sparse polynomial rings, `polys` also has sparse rational function fields. In effect, it allows us to create sparse rational functions without altering `polys` at all.
I modified some functions in `ring_series` to work with a rational function field here, and it works quite well indeed:
```
In : from sympy.polys.fields import *
In : F, a, b = field('a, b', QQ)
In : p = rs_sin(a + b, a, 3)/a**5
In : p*a
Out: (-3*a**2*b - 3*a*b**2 + 6*a - b**3 + 6*b)/(6*a**4)
```
Note that all these are field operations, and I haven't modified `polys` in any way. Elegant!
But then again, having a field increases the complexity, as we need to evaluate the numerator and denominator separately.
Fractional powers are a much trickier case, as there is no simple solution like the one above. What we can do is optimise the option I presented in my last post, i.e., have each fractional power as a generator. But doing that opens up a Pandora's box. Simple things such as `sqrt(a)**2 == a` no longer hold true. The current code treats `sqrt(a)` as a constant if we are expanding with respect to `a`:
```
In : rs_series(sin(a**QQ(1,2)), a**QQ(1,2), 5)
Out: -1/6*(sqrt(a))**3 + (sqrt(a))
In : rs_series(sin(a**QQ(1,2)), a, 5)
Out: sin(sqrt(a))
```
So, if we indeed decide to tread this path, we would need to replace `a` with `sqrt(a)**2`. This really complicates the situation, as we need to figure out what to replace with what. In any calculation the powers change multiple times, and each time we'll need to figure out how to rewrite the series.
It is now mostly a design decision whether we want `ring_series` to be confined within the `polys` module. The `polys` environment allows efficient manipulation of Laurent series (with `FracField`), but I doubt we can achieve the speed we want with Puiseux series without modifying `polys`. One possible solution is to separate the modified parts of `polys`, along with `ring_series`, from `polys` itself. We are using `polys` only because it has the data structure that we want. Separating them would allow us to simultaneously make use of its back-end and not introduce unnecessary complexity into our representation.
Other than that, documentation is another priority now. I had planned to do it earlier too, but couldn't. This week's discovery has reminded me of its importance.
Sorry for the delayed post! Last week was extremely busy.
It's time to wrap up my work. The good news is that `rs_series` (I called it `series_fast` earlier) works well for Taylor series. The speedups are impressive and it can handle all sorts of cases (so far!). Now, I need to make it work for Laurent and Puiseux series. The `ring_series` functions themselves work well with negative and fractional powers, so ideally that shouldn't be difficult. However, my current strategy is to add variables as generators to the currently used ring, and the back-end for creating rings is in `polys`, which doesn't allow negative or fractional powers in the generators (that is the mathematical definition of polynomials). For example:
```
In : sring(a**QQ(2,3))
Out: (Polynomial ring in a**(1/3) over ZZ with lex order, (a**(1/3))**2)
In : _.gens
Out: ((a**(1/3)),)
```
Contrast this with:
```
In : sring(a**2)
Out: (Polynomial ring in a over ZZ with lex order, a**2)
```
Generators with negative or fractional powers are treated as symbolic atoms and not as some variable raised to some power. So these fractional powers will never simplify with other generators with the same base.
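A small runnable illustration of the point (the comments restate the behaviour shown in the sessions above):

```python
from sympy import QQ, symbols
from sympy.polys.rings import sring

a = symbols('a')

# a**(1/3) becomes a single opaque generator: the ring has no idea that
# it is a power of a.
R1, p1 = sring(a**QQ(2, 3))
print(R1.gens)

# With both a and a**(1/2) present, the two generators stay unrelated,
# so nothing inside the ring will ever simplify (a**(1/2))**2 back to a.
R2, p2 = sring(a + a**QQ(1, 2))
print(R2.gens)
```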
The easy way to fix this is to modify `sring`, but that would mean changing the core of `polys`. I am still looking for a better way out.
The polynomial wrappers PR had been lying dead for quite some time. It currently uses Piranha's `hash_set`; it needs to work with `std::unordered_set` when Piranha is not available. I am adding that here. It is mostly done, except for the `decode` functions. Once the wrappers are in, I can make `rs_series` work for Puiseux series.
- Complete the polynomial wrappers.
- Port the low-level `ring_series` functions to SymEngine.
I spent a good amount of time this week trying to make the series function more intelligent about which ring it operates on. The earlier strategy of using the `EX` ring proved to be slow in many cases. I had discussions with Kalevi Suominen, a SymPy developer, and we figured out the following strategy:
- The user inputs a `Basic` expr. We use `sring` over `QQ` to get the starting ring.
- We call the individual functions by recursing on `expr`. If `expr` has a constant term, we create a new ring with the additional generators required by the function (e.g., `sin` and `cos` in the case of `rs_sin`) and expand `expr` over that.
This means that each `ring_series` function can now add generators to the ring it gets, so that it can expand the expression. A toy version of this flow is sketched below.
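Here is that sketch, built from the public low-level routines; the choice of expression, and the assumption that `a` comes first in `R.gens`, are mine:

```python
from sympy import QQ, symbols
from sympy.polys.rings import sring
from sympy.polys.ring_series import rs_sin

a, b = symbols('a b')

# Step 1: build the smallest ring over QQ that can hold the argument.
R, p = sring(a**2 + a*b, domain=QQ)

# Step 2: hand the polynomial to the matching low-level routine. Here
# the argument has no constant term in a, so no extra generators
# (sin/cos of a constant) need to be added to the ring.
x = R.gens[0]  # assumed: a sorts first among the generators
print(rs_sin(p, x, 10))
```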
This results in considerable speed-up, as we do operations on the simplest possible ring, as opposed to `EX`, which is the most complex (and hence slowest) ring. Because of this, the time taken by the `series_fast` function is only marginally more than direct function calls. The function doesn't yet have code for handling arbitrary expressions, which will add some overhead of its own.
Most of the extra time is taken by `sring`. The overhead is constant, however (for a given expression). So for `series_fast(sin(a**2 + a*b), a, 10)` the extra routines take about 50% of the total time (the function is 2-4x slower), while for `series_fast(sin(a**2 + a*b), a, 100)` they take 2% of the total time and the function is almost as fast as a direct function call.
There is, of course, scope for speedup in `sring` itself (as mentioned in its file). Another option is to minimise the number of calls to `sring`, if possible to just one (at the very beginning).
In my last post I talked about the new `solveset` module that Harsh and Amit are working on. I am working with them, and I sent a patch to add a `domain` argument to the `solveset` function. It is pretty cool stuff, in that the solution is always guaranteed to be complete.
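A usage sketch of the `domain` argument as it works in SymPy today (the equation is just an example):

```python
from sympy import S, Symbol, sin, solveset

x = Symbol('x')

# Over the complex numbers (the default) and restricted to the reals;
# in both cases the result describes *all* solutions in closed form.
print(solveset(sin(x), x))
print(solveset(sin(x), x, domain=S.Reals))
```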
I haven't yet been able to start porting the code to SymEngine, as the polynomial wrappers are not yet ready. Hopefully, they will be done by next week. Till then, I will focus on improving `series_fast` and any interesting issues that come my way.
- Write a fully functional `series_fast`. Profile it properly and optimise it.
- Document the functions and algorithms used in `ring_series`.