
Posted

This is so much fun, but it will lead (I hope) to a bun-fight.

 

Suppose that [math]U,V,W[/math] are vector spaces, and that the linear operators (aka transformations) [math]f:U \to V,\,g:V \to W[/math].

 

Fix [math]U[/math]

 

Then we know that the composition [math]g \cdot f: U \to W[/math] exists, as shown here.

 

Now, it is not hard to show that the set of all operators [math]U \to V[/math] is a vector space (usual axioms, naturally).

 

Let's call the vector space of all such operators [math]L(U,V)[/math] etc. Then I will have that [math]f \in L(U,V),\, g\in L(V,W),\,g \cdot f \in L(U,W)[/math] are vectors in these spaces.

 

The question naturally arises: what are the linear operators that act on these spaces? Specifically, what is the operator that maps [math]f \in L(U,V)[/math] onto [math]g \cdot f \in L(U,W)[/math]?

 

By noticing that here [math]U[/math] is fixed, and that [math]g: V \to W[/math], we may be tempted to suggest it is simply [math]g[/math]; but this cannot be, since the domain of [math]g[/math] is [math]V[/math], not [math]L(U,V)[/math].

 

Now I have available the notation [math]L(U,g): L(U,V) \to L(U,W)[/math], which has the distinct advantage of telling me exactly what's going on. But, for reasons which I hope to make clear, I will use a perfectly standard alternative notation [math]L(U,g) \equiv g_\ast:L(U,V) \to L(U,W),\, g_\ast(f) = g \cdot f[/math].

 

Now, looking up at my diagram, I can think of this as pushing the tip of the f-arrow along the g-arrow to become the composite arrow.

 

Accordingly, I will call this the push-forward of [math]f[/math] along [math]g[/math], or, by a horrid abuse of English as we normally understand it, the push-forward of [math]g[/math].

 

So, [math]g_\ast(f) = g \cdot f[/math]; no real shocks here, right? Ah, just wait, the fun is yet to begin, but this post is already over-long, so I'll leave you to digest this for a while.........
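Since this thread is all about arrows, here is a tiny illustrative sketch in Python/numpy of the pushforward as defined above, with matrices standing in for the linear maps (the name push_forward is mine, purely for this example):

[code]
# A minimal sketch of g_*(f) = g . f ("do f first, then g"), with linear
# maps represented as matrices acting on column vectors.
import numpy as np

F = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])        # f : U -> V  (dim U = 2, dim V = 3)

G = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])   # g : V -> W  (dim W = 2)

def push_forward(G):
    """g_* : L(U,V) -> L(U,W), sending f to the composite g . f."""
    return lambda F: G @ F

GF = push_forward(G)(F)           # the matrix of g . f : U -> W
u = np.array([1.0, -1.0])
assert np.allclose(GF @ u, G @ (F @ u))   # (g . f)(u) = g(f(u))
[/code]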

Posted

Nothing to digest, so far standard GR or differential geometry if you want... but I actually like your interpretations, easier to follow than the ones in the course I followed...

Posted

Fun Ben, :naughty: but you're getting me confused. B)

 

I'm not sure what you could mean by a fixed U, sounds more like you mean a fixed g and like you're hinting at duality... which would match up with the thread title.

Posted

I would think that the domain of g is W, not V. *Edit* You are correct. Nonetheless:

 

If f = Fu + Fv, and if g = Gv + Gw; then

 

fg = Fu + (Fv + Gv) + Gw; where Fv + Gv is of push-pull interest as it varies. But continue. (I am also unclear about the convention g*(f) = f dot g, especially considering that the triangle appears to require only addition, and not a dot or cross product.)

Posted

Oooo goody, a discussion!

 

Nothing to digest, so far standard GR or differential geometry if you want.
We are talking vanilla spaces here, over some arbitrary field. My construction will be true (I think) for any tangent space on any manifold, but this is not required.

 

you're getting me confused. I'm not sure what you could mean by a fixed U,
Well, I simply meant that, for some particular [math]U[/math], I want to consider all linear maps from this space to any other space.
sounds more like you mean a fixed g
No not at all. If I declare that U is fixed in that sense, and V, W are generic, then so is the mapping [math]g: V \to W[/math].

 

So no: the mapping [math] g: V \to W[/math] is an arbitrary mapping from an arbitrary space to another such space.

and like you're hinting at duality... which would match up with the thread title.
I am - stay tuned

 

I would think that the domain of g is W, not V.

 

and not dot..... product.)

No, the domain of g is V, as defined.

 

My "dot" was supposed to denote function composition - was that not obvious?

Posted

I got it--I think.

 

For every f that maps U to V, there is g that maps V to W, such that f of g ultimately maps U to W. Conversely, g of f is equal to f of g? Does that not imply that f=g, and that U=V=W? Or am I thinking too far ahead? (Then you could have f and f-not, and g and g-not, for every U,V,W and U,V,W-not.)

Posted

I seem to have confused the issue by my use of the word "fixed". I simply meant that, for any single choice of [math]U[/math], I wanted to consider all possible spaces in the range of any arbitrary transformation from this [math]U[/math].

 

I called this [math]U[/math] a "fixed domain".

 

Notice also, that the usual notation for composing maps is: [math]g \cdot f[/math] means "do f first then do g". So....

 

....anyway, given [math]g: V \to W[/math] as a linear operator on vector spaces, we found [math]g_{\ast}: L(U,V) \to L(U,W),[/math] as the linear operator that maps [math]f \in L(U,V)[/math] onto [math]g \cdot f \in L(U,W)[/math], and called it the push-forward of [math]g[/math].

 

In fact let's make that a definition: [math] g_{\ast}(f) = g \cdot f[/math] defines the push-forward.

 

This construction arose because we were treating the space [math]U[/math] as a fixed domain. We are, of course, free to treat [math]U[/math] as a fixed codomain, like this.

 

 

This seems to make sense, certainly domains and codomains come into register correctly, and we easily see that [math]h \in L(W,U),\,\, h \cdot g \in L(V,U)[/math].

 

Using our earlier result, we might try to write the operator [math]L(g,U): L(W,U) \to L(V,U), \,\,h \mapsto h \cdot g[/math], but something looks wrong; [math]g[/math] is going "backwards"!

 

Nothing daunted, let's adopt the convention [math]L(g,U) \equiv g^{\ast}[/math]. (We will see this choice is no accident)

 

Looking up at my diagram, I can picture this as pulling the "tail" of the h-arrow back along the g-arrow onto the composite arrow, and accordingly (using the same linguistic laxity as before), call [math]g^{\ast}[/math] the pull-back of [math]g[/math], and make the definition:

 

[math]g^{\ast}(h) = h \cdot g[/math] defines the pullback (Compare with the pushforward)
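Here is the companion sketch for the pullback, again with matrices standing in for the maps (pull_back is my name for it, just for illustration):

[code]
# g^*(h) = h . g: the pullback pre-composes with g, so the g-arrow is
# "consumed" at the tail end of h.
import numpy as np

G = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])   # g : V -> W  (dim V = 3, dim W = 2)

H = np.array([[1.0, 4.0],
              [0.0, 2.0]])        # h : W -> U  (dim U = 2)

def pull_back(G):
    """g^* : L(W,U) -> L(V,U), sending h to h . g."""
    return lambda H: H @ G

HG = pull_back(G)(H)              # the matrix of h . g : V -> U
v = np.array([1.0, 2.0, 3.0])
assert np.allclose(HG @ v, H @ (G @ v))   # (h . g)(v) = h(g(v))
[/code]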

 

This looks weird, right? But it all makes beautiful sense when we consider the following special case of the above.

 

 

Here I have assumed that [math]\mathbb{F}[/math] is the base field for the vector spaces [math]V,\,W[/math]. As before, the composition makes sense, and I now have [math]\phi \in L(W, \mathbb{F}),\, \phi \cdot g \in L(V, \mathbb{F})[/math], and the pullback [math]g^*: L(W,\mathbb{F}) \to L(V,\mathbb{F})[/math].

 

But, hey, lookee here....

 

[math]L(W, \mathbb{F}),\,\, L(V, \mathbb{F})[/math] are the vector spaces of all linear maps [math]W \to \mathbb{F},\,\,V \to \mathbb{F}[/math], so we quite simply have that [math] L(W,\mathbb{F}) = W^*,\,\,L(V, \mathbb{F}) = V^*[/math], the dual vector spaces.

 

Putting this all together I find that, for [math]g: V \to W[/math] I will have [math]g^{\ast}:W^{\ast} \to V^{\ast}[/math] as my pullback.

 

I say this is just about as nice as it possibly could be.
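In coordinates this is, I believe, just the familiar transpose: a covector on [math]W[/math] is a row vector, and pulling it back along [math]g[/math] multiplies it by the matrix of [math]g[/math] on the right. A quick sketch (same caveats as before):

[code]
# The pullback g^* : W^* -> V^* in coordinates: row covectors multiply G
# on the right, which is the same as G^T acting on column coordinates.
import numpy as np

G = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])   # g : V -> W

phi = np.array([3.0, -1.0])       # phi in W^*, as a row covector

pulled = phi @ G                  # g^*(phi) = phi . g, a covector on V
v = np.array([1.0, 2.0, 3.0])
assert np.allclose(pulled @ v, phi @ (G @ v))  # g^*(phi)(v) = phi(g(v))
assert np.allclose(pulled, G.T @ phi)          # i.e. g^* acts by G^T
[/code]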

 

PS I trust you will all excuse my prolixity - it's Sunday pm, and I'm having such fun! And there is yet more fun to come; stay tuned........

Posted
I got it--I think.

 

For every f that maps U to V, there is g that maps V to W, such that f of g ultimately maps U to W. Conversely, g of f is equal to f of g?

 

No, that is undefined: because f goes from U to V and g from V to W, you can have g of f, since it goes U -> V -> W.

 

f of g would go V -> W -> U, which would imply that [math]f \in \mathcal{L}(W,U)[/math], which is not the case.

Posted

Thank you Sanctus, I seem to have missed that last post of lawcat.

 

Anyway, assuming we masochists are all still on board, let's continue. For now, I'll stick with the pullback. We have a gadget that sends [math]V \to V^{\ast},\,\,\, g \to g^{\ast}[/math]. Now I could call this gadget [math]^{\ast}[/math], but that would probably freak you out, and moreover would fail to capture the general situation.

 

So, although I used the shorthand [math]g^{\ast} \equiv L(g,\mathbb{F})[/math], it will be easier from now on if I revert to the longhand version [math]g^{\ast} \equiv L(g, \mathbb{F}): L(W,\mathbb{F}) \to L(V,\mathbb{F})[/math].

 

Now this generalizes: for any [math]f: X \to Y,\,\, L(f,Z): L(Y,Z) \to L(X,Z)[/math] (the [math]X,\,Y[/math] are vector spaces, obviously)

 

This suggests the notation [math] ^{\ast} \equiv L(-, Z): f \to L(f,Z),\,\,\, X \mapsto L(X,Z)[/math]. This will take some explaining.

 

What I have here is some sort of gadget with an empty "slot", into which I can drop either an object, here a vector space, OR a mapping from one such object to another such, and find another sort of space with its own mappings. Such an object is called a functor; it maps objects to objects and maps to maps.

 

Moreover, since I will have that, for any [math]f:X \to Y[/math], [math] L(-,Z): f \to L(f,Z)[/math] with [math] L(f,Z): L(Y,Z) \to L(X,Z)[/math], this functor is by definition a contravariant functor, i.e. it turns arrows around.
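To see the arrow-reversal concretely, here is a small higher-order sketch, with plain Python functions standing in for linear maps (the name L is mine, for this example only):

[code]
# The contravariant hom functor L(-, Z) on maps: it sends f : X -> Y to
# L(f,Z) : L(Y,Z) -> L(X,Z) by pre-composition.
def L(f):
    """Send h to h . f -- note the reversed arrow."""
    return lambda h: (lambda x: h(f(x)))

f = lambda x: 2 * x        # f : X -> Y
g = lambda y: y ** 2       # g : Y -> W
h = lambda w: w + 1        # h : W -> Z, i.e. h in L(W, Z)

# Contravariance proper: L(g . f, Z) = L(f, Z) . L(g, Z) -- the
# composition order flips when the functor is applied.
gf = lambda x: g(f(x))     # g . f : X -> W
assert L(gf)(h)(3) == L(f)(L(g)(h))(3)   # both give h(g(f(3)))
[/code]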

 

So, returning to my earlier example: if [math]L(-,\mathbb{F}): V \to L(V,\mathbb{F})\,\,(\equiv V^{\ast})[/math], does it make any sense to talk of a contravariant functor mapping a contravariant tensor onto a covariant tensor?

 

In like fashion I find that [math]L(U,-)[/math] is a covariant functor, so does it make sense to talk about a covariant functor mapping a covariant tensor onto a contravariant tensor? (Note edits)

 

Think on this: if I declare that some function [math]f[/math] is a real-valued function, I simply mean that, whatever argument [math]x[/math] this function takes, [math]f(x) \in \mathbb{R}[/math], right? So why can't I have, in this spirit, that whatever argument a contravariant functor takes, the result must be contravariant? Can a contravariant functor take a contravariant tensor as argument?

 

NO, so in my opinion, these terms are best avoided in the context of tensors.

 

Throw your bread rolls now.

Posted
Throw your bread rolls now.
:evil:

 

Putting this all together I find that, for [math]g: V \to W[/math] I will have [math]g^{\ast}:W^{\ast} \to V^{\ast}[/math] as my pullback.
Well, errr, unless I'm more confused than I think, I suppose we could also say that [math]g_{\ast}:V^{\ast} \to W^{\ast}[/math], couldn't we?

 

What I have here is some sort of gadget with an empty "slot", into which I can drop either an object, here a vector space, OR a mapping from one such object to another such, and find another sort of space with its own mappings. Such an object is called a functor; it maps objects to objects and maps to maps.
Couldn't the functor just be called an application whose domain is the union of different kinds of structures?

 

:rainumbrella:

 

:fluffy:

Posted
Well, errr, unless I'm more confused than I think, I suppose we could also say that [math]g_{\ast}:V^{\ast} \to W^{\ast}[/math], couldn't we?
Essentially you want pushforwards and pullbacks to be mutual inverses, which I think may be true in the general case that all spaces in sight are isomorphic.

 

In the present case, as is evident, I had a bit of a problem with this, since it can only accidentally be the case that a vector space is isomorphic to the field over which it is defined (generally, the vector space is smaller than its field). I so wanted it to be true, but I just couldn't force it to be. Look....

 

Suppose [math]V,\,W[/math] are isomorphic, which I will write as [math]V \simeq W[/math]. Then, for any invertible [math]g:V \to W[/math], I will have [math]g^{-1}: W \to V[/math]. Then, as before, I will have the pullback [math](g^{-1})^{\ast}: V^{\ast} \to W^{\ast}[/math].

 

But [math]V \simeq W \Rightarrow V^{\ast} \simeq W^{\ast}[/math] so that I also have the inverse [math](g^{\ast})^{-1}[/math].

 

I am reasonably sure these two maps are the same, which would be nice: the pullback of the inverse equals the inverse of the pullback.
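Here is a quick numerical sanity check in coordinates (using the row-covector picture from before, where a pullback acts by the transpose): both maps come out as the same matrix, since [math](G^{-1})^T = (G^T)^{-1}[/math].

[code]
# Check: the pullback of the inverse equals the inverse of the pullback;
# in coordinates both are (G^{-1})^T = (G^T)^{-1}.
import numpy as np

G = np.array([[2.0, 1.0],
              [1.0, 1.0]])                 # an invertible g : V -> W

pullback_of_inverse = np.linalg.inv(G).T   # (g^{-1})^* : V^* -> W^*
inverse_of_pullback = np.linalg.inv(G.T)   # (g^*)^{-1} : V^* -> W^*

assert np.allclose(pullback_of_inverse, inverse_of_pullback)
[/code]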

 

Sadly, neither of these guys is a pushforward as far as I can tell.

 

What we do have is the following:

 

Define [math]V^{\ast \ast}[/math], the space of all linear maps [math]V^{\ast} \to \mathbb{F}[/math], called the double dual of [math]V[/math]. Then I can show, for any pullback [math]g^{\ast}: W^{\ast} \to V^{\ast}[/math], that the mapping [math]g^{\ast \ast}: V^{\ast \ast} \to W^{\ast \ast}[/math] exists and is unique. It is the pullback of a pullback, which is a pushforward.
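In the same coordinate picture the point is almost embarrassingly simple: pulling back once transposes the matrix, so pulling back twice transposes it twice, and the arrow points "forwards" again:

[code]
# The pullback of a pullback acts by (G^T)^T = G: the double-dual map
# g^** : V^** -> W^** has the same matrix as g itself, going "forwards".
import numpy as np

G = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
assert np.allclose(G.T.T, G)
[/code]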

 

(Actually there is a trick to double duals which is routinely used in the definition of a tensor.)

 

Couldn't the functor just be called an application whose domain is the union of different kinds of structures?
Yes. I like to think of the word as being a hybrid between a function (objects to objects) and an operator (functions to functions)
Posted

Garsh Ben yer gittin' me confused!

 

...since it can only accidentally be the case that a vector space is isomorphic to the field over which it is defined (generally, the vector space is smaller than its field).
By my distant memories of linear algebra, the first is true iff V is 1 dimensional and the second is never true. For more than 1 dimension V will be homomorphic onto F, not vice versa. :Glasses:

 

Yes. I like to think of the word as being a hybrid between a function (objects to objects) and an operator (functions to functions)
Well, I just couldn't find any better bun at the moment! :oh_really: I always regard these words as being specific types of application.
Posted
By my distant memories of linear algebra, the first is true iff V is 1 dimensional and the second is never true. For more than 1 dimension V will be homomorphic onto F, not vice versa.
Eeek!! You are right, I boobed.

 

Theorem: Every n-dimensional vector space [math]V_n[/math] over [math]\mathbb{F}[/math] is isomorphic to [math]\mathbb{F}^n[/math].

 

*blush*
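As a peace offering, a tiny illustration of the theorem (a sketch; the coordinate extraction below assumes the basis [math]\{1, x, x^2\}[/math], and the example is mine):

[code]
# Polynomials of degree < 3 over R form a 3-dimensional real vector space;
# the coordinate map onto R^3 is linear and invertible: an isomorphism.
import numpy as np

def coords(p):
    """Coordinates of p(x) = a + b x + c x^2 in the basis {1, x, x^2}."""
    a = p(0)
    b = (p(1) - p(-1)) / 2
    c = (p(1) + p(-1)) / 2 - p(0)
    return np.array([a, b, c])

p = lambda x: 1 + 2 * x + 3 * x ** 2
q = lambda x: 4 - x

# Linearity: the coordinates of a sum are the sum of the coordinates.
s = lambda x: p(x) + q(x)
assert np.allclose(coords(s), coords(p) + coords(q))
assert np.allclose(coords(p), [1, 2, 3])
[/code]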

Posted

Aphorism of the week: If you dislike the taste of humble pie, don't attempt to do mathematics.

 

So after that slight humiliation, I have an idea how to proceed. I am going from the double dual space to tensors, which seems like an incongruous leap, but wait and see....

 

Recall that the double dual to a vector space [math]V[/math] is the space [math]V^{\ast \ast}[/math] of all linear maps [math]V^{\ast} \to \mathbb{F}[/math]. Let's now specify that our field is the field of real numbers, and dump the blackboard notation and write [math]R[/math] for this field.

 

Now, any two (or more) vector spaces are said to be isomorphic if their elements (i.e. vectors) can be expanded as the sum of scalar products of the same number of basis vectors. So [math]V \simeq W[/math] iff, for [math]v= \sum_i \alpha^i e_i \in V[/math] and [math]w=\sum_i \beta^i f_i \in W[/math], the sets [math]\{e_i\},\, \{f_i\}[/math] have the same cardinality.

 

Now by construction this is true of any vector space and its dual, thus [math]V \simeq V^{\ast}[/math]. But this isomorphism depends on a particular choice of basis, by which I mean that if, for some choice of basis, [math]v \in V[/math] maps invertibly onto [math]w \in V^{\ast}[/math], under a different choice of basis I may have that [math]v \in V[/math] maps invertibly onto [math]w' \ne w \in V^{\ast}[/math].

 

One says this isomorphism is not "natural".

 

Now if [math]V \simeq V^{\ast},\,\,V^{\ast }\simeq V^{\ast \ast}[/math] then by transitivity of this relation I will have that [math]V \simeq V^{\ast \ast}[/math]. It can be shown that this isomorphism is independent of the choice of basis in the sense of the above, and one says that this is a "natural isomorphism".

 

In this circumstance, one says that [math]V = V^{\ast \ast}[/math] "up to a natural isomorphism" and then conveniently forgets the "up to..." bit. Thus we have that [math]V^{\ast}: V \to R,\,\,\,\,V: V^{\ast} \to R[/math].
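The natural map [math]V \to V^{\ast \ast}[/math] is nothing but "evaluation at v", and the nice thing is that no basis appears anywhere in its definition. A sketch (the function name is mine):

[code]
# The natural isomorphism V -> V**: v goes to the functional that eats a
# covector phi and returns phi(v). No choice of basis is made anywhere.
import numpy as np

def into_double_dual(v):
    """The element of V** corresponding to v."""
    return lambda phi: phi @ v     # evaluation: phi(v)

v = np.array([1.0, 2.0])
phi = np.array([3.0, -1.0])        # a covector, as a row vector
assert into_double_dual(v)(phi) == phi @ v
[/code]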

 

Then with a couple more definitions (Cartesian product and outer product), we will easily find our first tensor, but for now I am out of puff.

Posted

Bun:

Now, any two (or more) vector spaces are said to be isomorphic if their elements (i.e. vectors) can be expanded as the sum of scalar products of the same number of basis vectors.
True if they are also over the same field.

 

Bun:

Now by construction this is true of any vector space and its dual, thus [math]V \simeq V^{\ast}[/math].
For a finite number of dimensions. Perhaps also for a countable number? Foggy memories, foggy memories...

 

What I do remember is that, even when V isn't self dual, it'll still be isomorphic to its double-dual.

See if I can figure it out again... :scratchchin:

Posted
Bun: True if they are also over the same field.
Good shot with your bun. You are right, but as I was at pains in my last post to stipulate we were working over [math]R[/math], I caught it and ate it.

 

The following fact may be mildly disconcerting: the field [math]C[/math] of complex numbers, when considered as a vector space with scalar coefficients taken from [math]R[/math], is a REAL vector space. Wow!

 

So, by your note, we will have that [math]R^2 \simeq C[/math] as real vector spaces.

 

Now [math]R^2 \equiv R \times R[/math], the Cartesian product of vector spaces, whose elements are ordered pairs of the form [math](v,w)[/math] (with [math](a,b) \ne (b,a)[/math]), together with the requirement that vector addition and scalar multiplication make sense: so for all [math]v_i \in V,\, w_i \in W[/math],

 

[math](v_1,w_1) +(v_2,w_2) = (v_1+v_2,w_1+w_2)[/math] and

[math]\alpha(v_1,w_1) = (\alpha v_1, \alpha w_1)[/math].

 

Thus I may have, for any linear vector space [math]V \ni v,\,w[/math], that [math]V \times V \ni (v,w)[/math] is a NEW linear vector space.
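A bare-bones sketch of those two rules, with pairs of numpy vectors standing in for elements of [math]V \times V[/math] (helper names are mine):

[code]
# The product space: pairs, with componentwise addition and scaling.
import numpy as np

def add(p, q):
    (v1, w1), (v2, w2) = p, q
    return (v1 + v2, w1 + w2)      # (v1,w1) + (v2,w2)

def scale(a, p):
    v, w = p
    return (a * v, a * w)          # alpha(v,w) = (alpha v, alpha w)

v, w = np.array([1.0, 0.0]), np.array([0.0, 2.0])
assert np.allclose(add((v, w), (v, w))[0], scale(2.0, (v, w))[0])
[/code]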

 

Recall that [math]V:V^{\ast} \to R[/math] as a linear mapping, that is, for [math]\alpha \in R,\,\, \phi \in V^{\ast},\,\, v(\alpha \phi) = \alpha v(\phi)[/math]; the image of a scalar product is a scalar product of the image (by "scalar product" I do NOT mean inner product).

 

Let's now define a gadget [math]\otimes(V \times V) = V \otimes V: V^{\ast} \times V^{\ast} \to R,\,\, (v \otimes w)(\phi,\psi) = v(\phi)w(\psi) \in R[/math] and note that it is bilinear, that is, linear in each argument taken separately.

 

Know what? I am going to call the chap [math]v \otimes w [/math] a type (2, 0) tensor.
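In coordinates this chap is just the outer product of the two coordinate vectors, which makes the bilinearity easy to see (another illustrative sketch):

[code]
# v (x) w as a type-(2,0) tensor: its matrix is the outer product, and
# feeding it two covectors (phi, psi) returns v(phi) * w(psi).
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])
T = np.outer(v, w)                 # the matrix of v (x) w

phi = np.array([1.0, -1.0])        # phi, psi in V^*
psi = np.array([0.5, 2.0])

assert np.isclose(phi @ T @ psi, (phi @ v) * (psi @ w))
[/code]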

 

A parallel argument will give us [math]V^{\ast} \otimes V^{\ast}: V \times V \to R[/math] and [math]\phi \otimes \psi [/math] as a type (0, 2) tensor. In some ways this is the more interesting twin. If there's any interest I can show why.

 

So anyway, all we need to do now is clean up our notation a little, and add a few bells and whistles, and feel we've accomplished something.

 

Bun: For a finite number of dimensions
That was a better shot - I was sloppy there; you are right.
