This video is remarkable. You're hearing these words months, maybe years, after I've spoken them, yet everything is as clear as if we were sitting in the same room. The ability to record and transmit different kinds of information is a core part of modern engineering – and the world as we know it. That's why some say we're living in the information age. Whether you're using your phone, turning on the radio, or strumming your electric guitar, you're sending and receiving signals all the time. And to get all that information where it needs to go, you'll need signal processing.

[Theme Music]

As an engineer, communicating means more than having a chat in the break room. Whether you're watching YouTube videos, using satellite navigation, or just making a phone call, there's communication happening. Signals are representations of the information we're sending when we do this. Text, sounds, images, and even computer files will all be converted into a signal when you send them. And that's really what communication is: sending stuff from one place to another to convey information. The basic task is to take content, turn it into a signal, transmit it, and then turn it all back into content on the other end. These steps are known as signal processing. The signal itself will be a current running through a wire or an electromagnetic wave, like radio or light. However you choose to relay it, the overall process is basically the same.

The problem of communicating remotely is one engineers faced long before digital computers came onto the scene. We saw an example of this in the history of electrical engineering, with Samuel Morse's 1837 telegraph. In his design, the operator pushed down a lever, called a key, to complete a circuit and transmit an electric current down a wire. At the other end, a machine called a register would receive that current and mark a piece of paper. By pressing down the key for different lengths of time, the operator could make the register draw little dots and dashes that spelled out a message.

The key and register in Morse's telegraph are both examples of what are called transducers. Transducers take physical information, like the operator's press of the lever, and turn it into a signal, or vice versa. To record this video, for example, the input transducers were the microphone and the camera I'm speaking to, which measured the sound and light in this environment and converted them into electrical signals. Watching the video involves output transducers: things like your headphones and monitor. Unlike Morse's system, however, the signal won't stay in one form between transducers. It might start out as an electric current in the camera that gets converted into a file on a memory card. That's transmitted again as a signal when we send the file to a computer or upload it to the internet, where it's stored on YouTube's servers. At least, until you request that the signal be sent to you in its final form, to be converted back into light and sound.

Morse's system was popular because it was simple and remarkably easy to use, ushering in the era of instant communication we enjoy today. The ingenious part was finding a way to take information as people understand it, in terms of ordinary letters and words, and encode it in a form that could be transmitted as electricity. Encoding is a key part of signal processing. Signals need a transmission-friendly way of representing the information you're trying to relay.
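To make that idea concrete, here's a minimal sketch of Morse-style encoding in Python. It's purely illustrative: the tiny lookup table and the function name are made up for this example, not anything from the episode.

```python
# A minimal, illustrative sketch of Morse-style encoding (not the full code table).
# The table and function name are invented for this example.
MORSE_TABLE = {
    "S": "...",
    "O": "---",
    "E": ".",
    "T": "-",
}

def encode_morse(message: str) -> str:
    """Turn ordinary letters into dots and dashes, separating letters with spaces."""
    return " ".join(MORSE_TABLE[letter] for letter in message.upper())

print(encode_morse("sos"))  # ... --- ...
```

On the wire, each dot or dash would become a shorter or longer pulse of current, which is exactly the kind of transmission-friendly representation described here.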
A hundred years after Morse unveiled his telegraph, it was replaced by more sophisticated and convenient forms of communication, like telephones and radios. But these methods – and everything up to the internet today – are still based on encoding. It's the way the information is encoded and how it's transmitted that's changed.

Consider radio waves, like the kind used to transmit signals between your phone and a cell tower. It's the wave nature of radio that lets your phone encode the information you need to make a call. Engineers design hardware that changes, or modulates, the behavior of that wave to encode information about the pressure of the air near the microphone – in other words, the physical effects of sound. Two of the most common ways of doing this are Amplitude Modulation and Frequency Modulation, or AM and FM – that's where the names on your radio dial come from! One adjusts the amplitude, or strength of the wave, while the other changes the frequency, or how quickly one peak of the wave follows the next. Much like telegraph signals, the transmitted wave carries the information you want, which is then decoded on the other side. Similar methods can even represent sounds and images, which is how television broadcasts work.

But these methods have two pretty big limitations! The first is capacity. The signal of a radio wave can be thought of as a combination of other, simpler waves put together. Specifically, you can represent a signal as the sum of radio waves with different frequencies. The range of different frequencies you can use is called the bandwidth, and it limits how much information can be encoded by your signal, as well as how many signals can be sent at the same time. Think of signals as fluids and radio channels as pipes; the bandwidth is like the size of the pipe, which controls how much fluid can flow at once.

The other problem is noise. As they travel through the atmosphere, radio waves interfere with each other and are warped by objects in their path, both of which cause distortions. So the signal the other person receives usually ends up pretty different from the one that you sent! Noise is anything that changes your signal from its original form, usually in a random way. The greater the noise, the more distorted and unrecognizable the received message will be. That's why old TV sets sometimes ended up with 'static' in the image! To go back to the pipe analogy, noise would be any contamination the pipe puts into the fluid, changing its concentration. A tiny, contaminated pipe does a pretty terrible job of delivering lots of clean water. So as you can imagine, noisy channels with low bandwidth aren't great for sending signals that can be reliably decoded on the receiving end.

Worse still, both of these problems happen for wired communications as well. The signal traveling down a wire is also a wave, where the amplitude corresponds to the strength of the electric current at any given point in time. That's how we modulate electric currents to carry signals, but it also means that those signals suffer from noise and capacity issues, too.

Radio and wired communications faced these sorts of problems during World War II, which brought them to the attention of engineer and mathematician Claude Shannon. In 1948, he published A Mathematical Theory of Communication, which revolutionized how engineers think about information itself, and what it takes to send information reliably.
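Before getting into Shannon's result, here's a rough sketch of the two modulation schemes described above, just to make AM and FM concrete. It's a minimal NumPy illustration; the carrier frequency, message, and deviation numbers are arbitrary example values, not anything from the episode.

```python
import numpy as np

# Illustrative AM and FM sketch; all numbers are arbitrary example values.
t = np.linspace(0, 1, 10_000)           # one second of "time"
dt = t[1] - t[0]
message = np.sin(2 * np.pi * 5 * t)     # a slow 5 Hz message, standing in for audio
carrier_freq = 200                      # a fast 200 Hz carrier wave

# Amplitude modulation: the message varies the strength of the carrier.
am = (1 + 0.5 * message) * np.sin(2 * np.pi * carrier_freq * t)

# Frequency modulation: the message varies how quickly the carrier oscillates.
# The phase is (approximately) the running integral of the shifted frequency.
k_f = 20                                # frequency deviation in Hz (arbitrary)
phase = 2 * np.pi * (carrier_freq * t + k_f * np.cumsum(message) * dt)
fm = np.sin(phase)
```

Noise added to the received wave directly corrupts its amplitude, which is part of why AM broadcasts tend to sound noisier than FM ones. Either way, the amount of noise and the available bandwidth limit how much information gets through, which is exactly the problem Shannon set out to quantify.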
Among Shannon's contributions was a mathematical formula for determining the conditions needed for sending a signal at a particular rate. Imagine sending a Morse code message down a noisy wire. Each segment of the code represents a dot or a dash, what you might call a "bit" of the message. "Bit" stands for "binary digit", because each part of our message only occupies one of two states. In his paper, Shannon developed a formula that determines the number of bits you can transmit per second, or "bit rate", given the power of your signal, the amount of noise, and the bandwidth of the channel. When your internet provider advertises a speed of 50 megabits per second, that's Shannon's bit rate! He figured out that what determines the bit rate is the bandwidth together with the ratio of the power of the signal to the power of the noise. So either the signal needs to be strong enough, or the bandwidth needs to be large enough for there to be so many frequencies representing the signal that noise can't affect them all at once. (There's a small worked sketch of this formula at the end of this section.) As well as this handy formula, Shannon laid out lots of groundwork for calculating the exact conditions needed for reliable communication. Just as importantly, he worked out what kinds of signals you might need to represent the information you're trying to communicate. That work would be vital once signal processing entered the digital age.

Digital signals represent information using a small set of distinct states rather than the continuous variation of a wave. Instead of FM radio, where changes in frequency translate exactly to changes in sound, digital radio sends the data piece by piece, and everything is reassembled on the receiving end. Because the different states of the signal can be more distinct, they're much less susceptible to noise. A large difference is easier to distinguish than a small one, even when it gets distorted. Morse code, with its dots, dashes, and spaces, was an early digital communication system. But it would take the advent of computers for digital signaling to really take off. And it was Shannon's work that allowed computer scientists and electrical engineers to find ways of encoding different kinds of information in terms of 1s and 0s – what we now call binary code. Digital signals have come to form the basis of computing and every form of data associated with it, all of which are still in use today!

Of course, we've only just skimmed the surface. Signal processing overlaps with some serious technical challenges. There's the task of actually encoding different sorts of information as signals, and creating channels like phone lines and WiFi routers to transmit them. And there's the challenge of building hardware that transmits the final output, like computer monitors and headphones. But the end result is that you can stream videos like this one at the click of a button, virtually anywhere in the world. I might be a little biased, but I think that's pretty darn cool.

In this episode, we looked at the fundamentals of signal processing. We saw the need to represent information as a signal so it can be transmitted, and an example of that in Morse code. We explained how wired and wireless communications can suffer from the problems of bandwidth capacity and noise, and how Claude Shannon helped quantify the problem so that engineers could build around those limitations and bring about the digital age. Next time, we're headed out to sea to talk about moving physical objects with ships and marine engineering.
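As promised above, here's a small worked sketch of the formula behind that bit rate, the Shannon–Hartley capacity theorem. The 10 MHz bandwidth and the 31-to-1 signal-to-noise ratio below are assumptions chosen so the numbers land on the 50-megabit figure mentioned earlier; they're not from the episode.

```python
import math

# Shannon–Hartley channel capacity: C = B * log2(1 + S/N)
# B is the bandwidth in hertz; S/N is the ratio of signal power to noise power.
def channel_capacity(bandwidth_hz: float, signal_power: float, noise_power: float) -> float:
    """Maximum reliable bit rate, in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Example numbers (assumed): a 10 MHz channel with a signal 31 times stronger than the noise.
print(channel_capacity(10e6, signal_power=31, noise_power=1))  # 50000000.0 bits per second
```

Notice that doubling the bandwidth doubles the capacity outright, while doubling the signal power only adds a little to the logarithm, which matches the point that either a stronger signal or a wider range of frequencies can help beat the noise.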
Crash Course Engineering is produced in association with PBS Digital Studios, which also produces It's Okay To Be Smart, a show about our curious universe and the science that makes it possible, hosted by Dr. Joe Hanson. Check it out at the link in the description. Crash Course is a Complexly production and this episode was filmed in the Doctor Cheryl C. Kinney Studio with the help of these wonderful people. And our amazing graphics team is Thought Cafe.