Suppose you are introducing a new concept, and you start with (what you think is) a toy example. Is it okay to refer to your example as a "trivial problem"?
Does this depend on the field and/or level of the class you are teaching?
Answer
"Trivial" often means "too simple to be a real problem". For example, in an class on optimisation methods years ago, the lecturer said "for our first example, we're going to study the problem of maximising the number of ones in a binary string of length n". My thought for most of the class was "well, that's bloody obvious—you just write n ones in a row—why are we looking at this". I eventually realised that it was being used because it made it easy to describe how the algorithms worked, not because it was a challenging problem in any way.
I think that's a good example of where the word "trivial" is good shorthand—the problem isn't of any importance in itself, but it is being used to illustrate a point. This is similar to how "trivial" is used in a lot of pure mathematics, where it often means "the simplest example that satisfies the definition" (as in "the trivial group", written out below). As with all such jargon, though, it is worth explaining to students what you mean by the word "trivial" and why trivial examples are worth looking at at all.
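For readers who haven't met the pure-maths usage, here is the trivial group spelled out; this is the standard textbook definition, included only to make the "simplest example that satisfies the definition" reading concrete:

```latex
% The trivial group: a single element e, with one possible product.
\[
  G = \{e\}, \qquad e \cdot e = e .
\]
% Associativity, the identity axiom, and the existence of inverses
% all hold by that one equation, so G satisfies the group axioms.
```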