Yesterday we talked about the loop product and the associated Gerstenhaber and BV structures on the loop space of the manifold. I'm going to put the equivariant part of the story on hold for a little bit and come back to it.
Since the first papers of Chas-Sullivan, others have taken these ideas and made
them more rigorous. To my knowledge, Cohen-Jones was the first one. This uses the language of spectra. There are sort of two camps, and the other perspective is chain complexes. I put myself in the chain complex camp. I want to use my perspective to say what is going on in spectra, but I won't ever say that word. If you know about spectra, you'll be able to see what I'm trying to say. It's kind of like a religion, and a different religion. I'm a Christian trying to explain Buddhism: “their god is named Buddha and they're all vegetarian.” So if you're a Buddhist, I hope not to offend you.
I want to give a perspective on the Thom collapse and the loop product. If a
and b are homology classes represented by submanifolds A and B, then I look at a map from A × B to M × M. We're interested in where they intersect, so the diagonal map, and we want to look at the fiber product A ×_M B:
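The square being drawn here is presumably the usual pullback square:

\[
\begin{array}{ccc}
A \times_{M} B & \longrightarrow & A \times B \\
\downarrow & & \downarrow \\
M & \xrightarrow{\ \ \Delta\ \ } & M \times M
\end{array}
\]

with the right vertical map the product of the two inclusions and the bottom map the diagonal.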
So we looked at a map A ×_M B → M, of dimension |a| + |b| − d. This let us make a wrong-way map H∗(M) ⊗ H∗(M) → H∗−d(M). Let N be a tubular neighborhood of ∆(M) ⊂ M × M. I'm talking about the total space of the disk bundle of the normal bundle in M × M. The ingredients are going to be the Thom collapse, which, well, remember what we care about is the transversal preimage of the diagonal. We can get rid of things away from the diagonal by collapsing. I'll crush everything outside the neighborhood N of ∆ to a point. This is the Thom collapse map τ : M × M → M × M/(M × M − N).
The next ingredient is the Thom class. I want relative homology and cohomology classes. Look at the map of pairs j : (M × M, ∅) → (M × M, M × M − N). Then I get a map in the other direction on cohomology, H^∗(M × M) ← H^∗(M × M, M × M − N). The Thom class of M is a cohomology class u in H^∗(M × M, M × M − N) of degree d, whose pullback j^∗(u) is Poincaré dual to the fundamental class of M pushed forward under the diagonal: ∆∗[M].
I'm thinking of u as supported on N. So evaluating this against a class supported away from N gives zero; only what happens near the diagonal survives.
Theorem 1. Given N as small as you like, there exists such a u.
This isn’t stated precisely. It’s in McCrory, 1971, and Bott-Tu. Before, you may have seen the Thom isomorphism for the cohomology of bundles.
I'll state it for homology. If I look at H∗(M × M, M × M − N), that's isomorphic to H∗−d(M), where you should think you are capping with the Thom class.
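Up to the usual excision identifications, the isomorphism is capping with u (a sketch):

\[
H_*(M \times M,\; M \times M - N) \;\xrightarrow{\ \cap\, u\ }\; H_{*-d}(N) \;\simeq\; H_{*-d}(M).
\]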
Then we have what we need for homology intersection. Let's take two classes, put them in the tensor product H∗(M) ⊗ H∗(M) ≅ H∗(M × M), push forward along τ∗ to H∗(M × M, M × M − N), then go to H∗−d(M) by the Thom isomorphism, and this composition is •.
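Written as one line, with the cross product giving the first map (an isomorphism with field coefficients), this is:

\[
\bullet :\; H_*(M) \otimes H_*(M) \;\longrightarrow\; H_*(M \times M)
\;\xrightarrow{\ \tau_*\ }\; H_*(M \times M,\; M \times M - N)
\;\xrightarrow{\ \cap\, u\ }\; H_{*-d}(M).
\]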
In general, if you have an embedding of compact manifolds e : M ↪ X of codimension d, let N be a tubular neighborhood of e(M); then you can define a map on homology. The usual induced map would go from M to X, but you can define a wrong-way map e! : H∗(X) → H∗(X, X − N) → H∗−d(M), where the first arrow is τ∗. Here τ is the collapse map from X to X/(X − N). I can do all of this as long as the codimension is finite.
Cool. So what we want to do is take this idea and apply it to the loop product. I want to place what we did yesterday in the context we've been discussing.
Back to the loop product: How did we define the loop product? We took two
chains in the loop space, visualized them as chains of loops in the manifold, and on intersection loci we could concatenate loops. We really care about, for the loop product, intersection in M. It's not like we're trying to do intersection theory in the loop space. It might appear like that, and then it would be amazing, this is infinite dimensional, but we're really in M. Let's see how we can use this kind of perspective to say what we said yesterday. I'm thinking that we can make another fiber product
We have two loops and a point in M, so (γ, γ′, p) ∈ LM × LM × M such that (ev × ev)(γ, γ′) = ∆(p). So you're saying that the basepoint of both of these is p. We can identify this with the space {(γ, γ′) ∈ LM × LM : γ(0) = γ′(0)}. You can draw a picture of this. This is two loops that share a basepoint. This looks like a map of the figure eight into the manifold. Let me write this, then, as Maps(8, M). The figure eight is a circle with two points identified; you can also think of a point here as a map from a circle where two points agree. This should end up giving us a point in the loop space.
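In other words, Maps(8, M) is the pullback (a sketch of the square):

\[
\begin{array}{ccc}
\mathrm{Maps}(8, M) = LM \times_{M} LM & \longrightarrow & LM \times LM \\
\downarrow & & \downarrow {\scriptstyle\ ev \times ev} \\
M & \xrightarrow{\ \ \Delta\ \ } & M \times M
\end{array}
\]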
The diagonal is a codimension d embedding. It’s an exercise to show that what
you have above is also a codimension d embedding. You have infinite dimensional stuff but a finite codimension embedding, and then you have these Thom tools at your disposal. I have a tubular neighborhood of LM ×_M LM sitting in LM × LM.
Ñ is the preimage under (ev × ev) of N; this is a tubular neighborhood of the image of the pullback of ∆, that is, of LM ×_M LM inside LM × LM.
We'll do what we did before in the compact world, and things will work because of the finite codimension. So the Thom collapse will take us from LM × LM to the quotient by the complement, LM × LM/(LM × LM − Ñ), and the Thom class will relate the relative homology H∗(LM × LM, LM × LM − Ñ) to H∗−d(LM ×_M LM).
What else do I need? Let's note: I've got Maps(8, M) → LM; what is that map? It's concatenation. It's useful to think of the figure eight as the circle with two points identified.
Now we’re about ready to do it. Let’s define the loop product as we did for the
intersection product, replacing all the M's with LM's.
So take H∗(LM) ⊗ H∗(LM); that's isomorphic to H∗(LM × LM). We can push forward by τ∗ to H∗(LM × LM, LM × LM − Ñ), apply the Thom isomorphism to land in H∗−d(Maps(8, M)), and then we concatenate to land in H∗−d(LM). Then this composition is the product • from last time. There hasn't been any transversality. This is better if details are important for you, maybe.
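All in one line, then, the loop product is the composition (with the last map induced by concatenation at the shared basepoint):

\[
\bullet :\; H_*(LM) \otimes H_*(LM) \;\longrightarrow\; H_*(LM \times LM)
\;\xrightarrow{\ \tau_*\ }\; H_*(LM \times LM,\; LM \times LM - \widetilde{N})
\;\xrightarrow{\ \cap\, u\ }\; H_{*-d}(\mathrm{Maps}(8, M))
\;\longrightarrow\; H_{*-d}(LM).
\]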
What I’d like to do next is start introducing surfaces. We’ve pushed this product
pretty far, but we're going to start generalizing these. Depending on how well I say this, well, let's look at surfaces. I'll draw one picture that makes everything completely transparent.
[Pair of pants with figure eight as a deformation retract]

Somehow, in terms of what we're trying to do, this pair of pants is better than the figure eight. There are two circles at the top and one at the bottom. These are showing us something about the map. I don't care how big the surface is, I care that it's a little bigger than the figure eight, and it shows me that I'm starting with two and ending with one.
I want to draw two diagrams that are kind of the same. We have maps LM × LM ← Maps(8, M) → LM. I can say, if I have a pair of pants P and a map from P to M, I can restrict it to the inputs or outputs:
The Thom collapse let us make a wrong-way map on homology, from the homology of LM × LM to the homology of Maps(8, M). So now this gives a map H∗(LM) ⊗ H∗(LM) ≅ H∗(LM × LM) → H∗−d(Maps(P, M)) → H∗(LM).
Maybe it's obvious to you guys. If we didn't look at this as coming from a surface, you wouldn't know what to do next. What if we replace this with some other surface with boundary? What we're going to want to do next is generalize. This diagram, if I replace it with a general surface, a surface with boundary components labeled as incoming or outgoing, k incoming and ℓ outgoing: if I had a map from Σ to M then I could restrict to the outputs and get a point in LM^ℓ, to the inputs and get something in LM^k, and what we want to do is realize, somehow, ρ_in as a finite codimension embedding, so we can reverse this arrow on homology. If we could do that, we would get H∗(LM)^⊗k → H∗+shift(Maps(Σ, M)) → H∗(LM)^⊗ℓ.
So this naturally leads us to a k to ℓ operation coming from Σ. This is µΣ, say.
I think I’ll stop here for now. I went over yesterday and I’m slightly under, run
and get your caffeine, and we'll come back and talk about how to do this. How do we realize this as a finite codimension embedding? The figure eight is what did it for us. What is the analog of the figure eight for such a Σ? We'll talk in a couple minutes.
Welcome back, where are my notes? I forgot that I would be giving my talks back to back. Let me recall what we just did. We have a nice characterization of the loop product: consider the figure eight as a deformation retract of the pair of pants. We can see this as a two to one operation. We wanted to look at LM^2 ← Maps(P, M) → LM, by restricting to the input and output boundary. We see that, since the pair of pants contains the figure eight as a deformation retract, we can use Maps(8, M) for Maps(P, M). The construction gave us (ρ_in)^!, the wrong-way map on homology, and then we could use the usual induced map (ρ_out)∗, and that composition is the loop product. The way we can generalize this is by taking k inputs and ℓ outputs, and doing the exact same thing. Now I can replace P with Σ, and the shift will be different. When I restrict to the inputs I have k maps from the circle, and at the end I have ℓ loops. Now the full composition is:
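Assuming the degree shift works the way it does for Cohen-Godin's operations (this is a sketch; here χ(Σ) = 2 − 2g − (k + ℓ)):

\[
\mu_\Sigma :\; H_*(LM)^{\otimes k} \;\cong\; H_*(LM^{k})
\;\xrightarrow{\ (\rho_{in})^{!}\ }\;
H_{*+\chi(\Sigma)\, d}\big(\mathrm{Maps}(\Sigma, M)\big)
\;\xrightarrow{\ (\rho_{out})_{*}\ }\;
H_{*+\chi(\Sigma)\, d}\big(LM^{\ell}\big).
\]

For the pair of pants, χ(P) = −1, and this recovers the degree −d shift of the loop product.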
I'm going to leave string topology for a moment, but our goal is to ask: what is the analogue of the figure eight when I want to generalize these things? That's what we're getting at here. The pair of pants is good because it tells us this is a 2 to 1 operation; the figure eight tells us about the actual intersections.
Let me introduce the tools to address this question, fatgraphs.
Definition 1. A graph Γ is a 1-dimensional CW complex.
We call the 0-cells vertices and the 1-cells edges.
Definition 2. A fatgraph Γ is a graph together with a cyclic order of the edges adjacent to each vertex.
I can draw a picture using the orientation of the chalkboard. Let me tell you why this is more than just combinatorics. There is a construction taking a fatgraph and spitting out an orientable surface with boundary.
We take Γ to Σ(Γ), the ribbon surface. I want to take this graph and fatten it
to a surface. The thing that I can do, I can fatten each vertex to a little disk, and each edge to a band. I can use the cyclic order to tell me how to attach the bands to the disk.
Make sure that you don't twist. The resulting surface contains Γ as a deformation retract, or a spine.
Since we have this deformation retract, it's not always clear what the topological type of your surface is, but we know that the Euler characteristic of a graph Γ is #V − #E, and since we also know that χ(Σ(Γ)) = 2 − 2g − #(boundary components), we can set these equal to each other. It's not too hard to count boundary components, this picture has three, and so you can set these two calculations equal and see (in this case it's not too hard, the graph is planar) that the genus is 0.
Let’s see how this is different if we change the cyclic order: For me, I have to
use the blackboard order, so I’ll draw this:
You can do the same calculation, see that χ(Γ) = 2 − 3 = −1, count that we have only one boundary component, and calculate −1 = 2 − 2g − 1, so g = 1. This is a punctured torus.
I think of the fat graphs and ribbon surfaces as being the same thing. This isn’t
literally true, but because of this I will conflate boundary cycles and boundary circles.
Definition 3. A metric graph Γ is a graph with a metric (edges have lengths).
Now we have something continuous that we can define, edge lengths.
Definition 4. A marked metric fatgraph is a metric fatgraph together with a distinguished point on each boundary cycle.
The next definition, possibly not due to Sullivan:
Definition 5 (Cohen-Godin). A Sullivan chord diagram of type (g, k, ℓ) is a marked metric fatgraph with no univalent vertices, of genus g with k + ℓ boundary components, constructed from k disjoint circles (the inputs) together with a disjoint union of trees attached along the circles, such that k of the boundary cycles are isotopic to the k circles.
The extra stuff isn't a tree. Well, the scandal is, Sullivan hates that they call it a Sullivan chord diagram because he thinks that these should be included, and Cohen says that Sullivan didn't ever say that, so, you can probably guess who I believe.
This graph wants to keep track of the intersection where the two points at opposite ends of the chord coincide. I want to pick up where those are equal. If I try to say that in a degenerate diagram, something will be overdefined. On Friday I'll be able to get rid of that condition.
Let me make a remark. (Marked) metric fatgraphs of type (g, n) form a space
with continuous parameters the lengths. Call this (M)MFG(g, n). Then Sullivan chord diagrams are a subspace Sull(g, k, ℓ) ⊂ MMFG(g, k + ℓ).
Remember, the diagram I want to reproduce in the more general setting is this:

Let's do this for a fixed Sullivan chord diagram Γ. We'll start with k input circles, so that'll be LM^k.
Then the idea is that the endpoints of the trees on the input circles tell you what
you are trying to intersect. What is the Sullivan chord diagram that will give me the loop product? It's this:
So we want to identify the endpoints of this chord, say that they intersect at this point. So we can make a construction S from Sullivan chord diagrams to marked metric fatgraphs by collapsing trees.
[Example picture]

So S(Γ) is what an intersection would look like. Let V_in(Γ) be the vertices on the inputs. Let M^V = Maps(V, M), where I have two possible V's in mind: V_in(Γ), or V(S(Γ)). Here's a fact. The map S induces a map V_in(Γ) → V(S(Γ)). It then induces a map on the mapping spaces in the opposite direction, M^{V(S(Γ))} → M^{V_in(Γ)}. If you think about the vertices as ordered, M^V is just a Cartesian product, then everything is okay, and the map is just the diagonal. The codimension is |χ|d.
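As a sketch of the count (the identification of the drop with |χ|d is the claim being made here):

\[
M^{V(S(\Gamma))} \;\hookrightarrow\; M^{V_{in}(\Gamma)},
\qquad
\mathrm{codim} \;=\; d \cdot \big(\#V_{in}(\Gamma) - \#V(S(\Gamma))\big) \;=\; |\chi|\, d .
\]

For the loop-product diagram this is just the usual diagonal M → M × M, of codimension d.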
Here's my claim: we've discovered the analogue of the figure eight. S(Γ) is the new figure eight, and Maps(S(Γ), M) sits in a pullback square. The vertical map on the right: I have k inputs and I want points in M, and these should be in correspondence with V_in(Γ). These points on the graph tell you where the evaluation maps should occur. It's about as much work to see that Maps(S(Γ), M) is the fiber product or pullback.
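The pullback square I have in mind is (a sketch):

\[
\begin{array}{ccc}
\mathrm{Maps}(S(\Gamma), M) & \longrightarrow & LM^{k} \\
\downarrow & & \downarrow \\
M^{V(S(\Gamma))} & \longrightarrow & M^{V_{in}(\Gamma)}
\end{array}
\]

where the right vertical map evaluates the k input loops at the points of V_in(Γ), and the bottom map is the diagonal-type map from before.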
I'm going to go a couple of minutes over. Over here before, we had a codimension d embedding that let us get a wrong-way map on homology. I have a codimension |χ|d embedding, so that was what I wanted so that I could apply the Thom collapse. What does that diagram look like?
Instead of the figure eight, I have S(Γ), so I get
It's a finite codimension embedding, so I can reverse the arrow on homology and get the wrong-way map from H∗(LM^k) to the homology of Maps(S(Γ), M), shifted down by |χ|d.
Why are we working with Sullivan chord diagrams? The figure eight doesn't show us the inputs. Here we know what the input circles are, and that everything else is an output circle. So the first selling point is that we have this structure, even if it's hard to see; later I'm going to show you a construction that allows you to do something with the chords. That's a couple of reasons to think that the chord diagrams are a little bit better.
So far, basically every algebraic structure we have described has been an “algebra,” meaning that it has operations which take many inputs and have one output. We talked briefly about coassociative coalgebras, which have operations with one input and many outputs. Next I want to expand the world of the discussion some to talk about “gebras” in general, which can have operations that are many to many. Let me start with the example of Frobenius algebras.
Recall that the cochains of a space are a dga, and the cohomology is a commutative dga, while the chains are a dg coassociative coalgebra and the homology a cocommutative dg coassociative coalgebra.
If the space is a closed n-manifold, you can use Poincaré duality to identify the degree k homology with the degree n − k cohomology, which means that for an n-manifold M, the homology H∗M (or cohomology, but whatever) is both a commutative algebra and a cocommutative coalgebra; because this isomorphism changes the degree, the product map is degree −n instead of degree 0. Let me say as a side note, you don't need to use the duality isomorphism, then the cup product, then the duality isomorphism to describe the product; you can instead choose representatives of the homology classes you want to multiply that intersect one another transversally, and then the product will be the homology class of the transversal intersection of these representatives. Kate has talked about this.
So we have an algebra structure and a coalgebra structure on H∗(M). But these are not just an algebra and coalgebra that know nothing about one another; you can check that they are compatible in a very specific way, so that together they form the structure of an open Frobenius algebra, which I'm about to define.
I want to be a little careful here, because different people would call different things a Frobenius algebra. I know something like nine different nonequivalent versions of what a Frobenius algebra is, depending on whether it has a counit, a unit, an inner product, a co-inner product, whether you assume nondegeneracy, and so on. So for me,
Definition 6. A Frobenius algebra is a triple (F, µ, ∆) where F is a chain complex, (F, µ) is a commutative dga, (F, ∆) is a cocommutative dg coalgebra, and µ and ∆ satisfy the “Frobenius compatibility condition:”
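The compatibility condition, in the standard form I have in mind (up to Koszul signs), is:

\[
\Delta \circ \mu \;=\; (\mu \otimes \mathrm{id}) \circ (\mathrm{id} \otimes \Delta) \;=\; (\mathrm{id} \otimes \mu) \circ (\Delta \otimes \mathrm{id}) \;:\; F \otimes F \longrightarrow F \otimes F.
\]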
Because the product is commutative and the coproduct is cocommutative, this implies three other equations by acting on the left and on the right by σ ∈ S2. I won't write them down, but they all have the same left hand side and are similar.
Some people will describe this relation in different ways, with tensors or by saying something like “the coproduct is a module map over the algebra.” Frobenius algebras are also closely related to two-dimensional topological field theories. Not this version of Frobenius algebras, but one of the other, closely related versions is in bijection with functors from the 2-dimensional cobordism category to the category of chain complexes. This is now a little bit of an old-fashioned point of view, we should be using higher categories.
So this is the first gebra we've talked about. Here's another example that I won't go into in any detail. Suppose that G is a topological group. Then the multiplication G × G → G induces a multiplication on chains and homology, so that both of these are dgas. At the same time, G is still a space, so the chains and the homology form a dg coalgebra. You could do the same thing on cochains with which one was which reversed. In any event, these do NOT have Frobenius compatibility, but have something else, called bialgebra or Hopf compatibility, that I'm not going to write down. If G is a Lie group then the homology is both a Hopf algebra and simultaneously a Frobenius algebra, and maybe there's even more compatibility.
The other example that I do want to go into in a little more detail is the example of a Lie bialgebra. First, let me describe a Lie coalgebra. Just as in the case of a coassociative coalgebra, we get the definition of a Lie coalgebra by reversing all of the arrows.
Definition 7. A dg Lie coalgebra is a chain complex C along with a cobracket ∆ : C → C ⊗ C which
(1) is skew-symmetric: σ∆ = −∆, and
(2) satisfies the coJacobi relation that we get by turning Jacobi upside down:
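Concretely, writing σ now for the cyclic permutation of the three tensor factors, coJacobi says (signs as usual in the dg world):

\[
(1 + \sigma + \sigma^{2}) \circ (\Delta \otimes \mathrm{id}) \circ \Delta \;=\; 0 \;:\; C \longrightarrow C \otimes C \otimes C.
\]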
Just like we combined a dg algebra with a dg coalgebra to get a Frobenius algebra
or a bialgebra, we can imagine a structure that has a Lie bracket and a cobracket.
Definition 8. A Lie bialgebra is a chain complex C along with a bracket {, } and cobracket ∆ so that
(1) {, } makes C a dg Lie algebra,
(2) ∆ makes C a dg Lie coalgebra, and
(3) the bracket and cobracket satisfy Drinfel'd compatibility.
We say that the Lie bialgebra is involutive if the bracket annihilates the cobracket: {, } ◦ ∆ = 0.
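For the record, the Drinfel'd compatibility in (3) is the cocycle condition, which up to Koszul signs reads:

\[
\Delta\{x, y\} \;=\; \big(\{x, -\} \otimes \mathrm{id} \,+\, \mathrm{id} \otimes \{x, -\}\big)\,\Delta(y)
\;-\; \big(\{y, -\} \otimes \mathrm{id} \,+\, \mathrm{id} \otimes \{y, -\}\big)\,\Delta(x),
\]

and involutivity is the vanishing of the composition

\[
C \xrightarrow{\ \Delta\ } C \otimes C \xrightarrow{\ \{\,,\,\}\ } C.
\]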
Kate has shown you that this is the easiest structure that you get on the equivariant homology of the loop space, an involutive Lie bialgebra, and then I think she's aiming toward describing a richer structure on the equivariant chains that induces the involutive structure on homology.
4. Equivariant Chas-Sullivan String topology
There is also an equivariant version of Chas-Sullivan, it may be that the stuff we did yesterday made you unhappy, I might be about to make you mad again. There are two ways I'm going to make you mad. I shouldn't project. I'm going to use the loop product as we defined it yesterday. I'm going to do another thing in order to draw the pictures I want to draw. This is a really bad thing.
The naive thing is what I'm actually going to do. We said that, naively, the equivariant story was: look at the S^1 action on the loop space, and say that H^{S^1}_∗ is the homology of the naive quotient, H∗(LM/S^1). This can be a badly behaved space if there are fixed points, which is why the honest version constructs the homotopy quotient and takes the homology of that. I'm going to live in the naive world so that I can connect back to things that we know. The things I'm going to say can be made precise in the actual world.
So, let's think about what this space is. How does S^1 act on it? What is the orbit of γ ∈ LM? It's a family of loops parametrized by S^1, all with the same image; I'm just rotating it. You have a set, an equivalence relation, you can think of all the elements in an equivalence class, or an equivalence class itself. I might also think about this, represent the orbit by picturing an unmarked loop. The construction I'm going to show later takes one notion as its input and one as its output. A point in the space is an unmarked loop; Chas-Sullivan call this a string space. A loop has a basepoint, a string doesn't.
I'll describe the real thing in a not-real way, now, but don't worry. You can think about the fibration given by taking the actual quotient or homotopy quotient, and fit the homology into the exact sequence. I'm describing a real thing in a not-real way. One map goes H∗(LM) → H^{S^1}_∗(LM) and another goes H^{S^1}_∗(LM) → H∗+1(LM); let's call them E and M, standing for “erase” and “mark.” The picture is to take a loop and forget the marking, and you've got an unmarked loop, a string in the string space. If you have a marked loop, you have no canonical marking, you mark in all possible ways, so you go up by one. One remark: the BV operator is the composition of these: ∆ = M ◦ E. That's actually your BV operator. The other composition E ◦ M = 0.
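The exact sequence I am alluding to is the Gysin sequence of the circle action (this is the homotopy-quotient version of the statement):

\[
\cdots \longrightarrow H_n(LM) \xrightarrow{\ E\ } H^{S^1}_{n}(LM) \longrightarrow H^{S^1}_{n-2}(LM) \xrightarrow{\ M\ } H_{n-1}(LM) \xrightarrow{\ E\ } H^{S^1}_{n-1}(LM) \longrightarrow \cdots
\]

so E has degree 0, M has degree +1, ∆ = M ◦ E has degree +1, and E ◦ M = 0 because consecutive maps in an exact sequence compose to zero.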
Using the loop product defined on ordinary homology, I can define something on equivariant homology.

Definition 9. Let x and y be equivariant homology classes. I can define [x, y] = ±E(Mx • My).
This gives me an equivariant homology class of degree |x| + 1 + |y| + 1 − d, so the bracket is an operation of degree 2 − d. This is called the string bracket. If I do this in two dimensions, this should be degree 0. This satisfies the Jacobi identity. The equivariant homology with the string bracket is a Lie algebra. When d = 2, this agrees with the Goldman bracket.
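The degree count, spelled out:

\[
\big|[x, y]\big| \;=\; (|x| + 1) + (|y| + 1) - d \;=\; |x| + |y| + (2 - d),
\]

since M raises degree by 1, the loop product drops it by d, and E preserves it.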
Now that we've gone through the construction with Sullivan chord diagrams, I might think about unmarked loops or families of loops. Remember, for the non-equivariant version, if we mark in all possible ways and intersect at the markings, I have a family of diagrams that looks like this:

The markings at the top move around, and that's an S^1 × S^1 family, so that's T^2, and that's the 2 of 2 − d. We can use these similar diagrams to discuss equivariant homology operations, moving back and forth between the two ways of looking at things. Mark in all possible ways on the input, do the non-equivariant operations, and then forget the markings on the output. This is a little more abstract because there is a whole family of diagrams, and this family that I have written leads to the string bracket.
Because we did have that discussion, Gabriel brought up the Frobenius thing here: this Lie algebra is something I am kind of okay with, there is some compatibility here, it's Frobenius (Cohen-Godin). Then we have another diagram, a chord diagram, which gives a cobracket ∆ : H^{S^1}_∗(LM) → H^{S^1}_∗(LM) ⊗ H^{S^1}_∗(LM).
We cut and reconnected for the Goldman bracket, and then there was the Turaev cobracket. We're focusing on what's happening at the endpoints of the chord, so it's secretly basically the same thing. The picture is:
Theorem 2. The equivariant homology of LM with the string bracket and cobracket forms an involutive Lie bialgebra. When d = 2, this agrees with Goldman-Turaev.
Thinking about Sullivan chord diagrams, we've got algebraic structures generalizing the bracket and cobracket on equivariant homology. Given what we did earlier, I think it should be easy to imagine generalizing this to other structures. When it comes to surfaces, our fatgraphs giving us operations and surfaces, this is, well, here's the difference: Sullivan chord diagrams are marked metric fatgraphs, so there's a marked point on each boundary cycle. For the non-equivariant case, we have these surfaces Σ and if I have a map from Σ into M, I can restrict to the inputs and outputs; the thing that was important, in order to land in the loop space, was that there was a distinguished point. The ribbon surface for the chord diagram has the marked point. You can do the obvious thing to parameterize with the circle. Equivariantly, one way to look at it is by taking unmarked diagrams and, like I said, the annoying way of mixing things: I can mark the inputs but not the outputs, and take T^k families of these. That's not a great punchline but I'll stop there. It's probably easier to think about non-equivariant stuff. I'll go back and forth a little bit.