Sound from the siren of an emergency vehicle has a frequency of 750.0 Hz and moves with a velocity of 343.0 m/s. What is the distance from one condensation to the next?
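
The distance from one condensation (compression) to the next is one wavelength, so wavelength = speed / frequency = 343.0 m/s / 750.0 Hz ≈ 0.457 m, or roughly 46 cm between successive condensations.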

How does amplitude and frequency affect the pitch of a sound wave?

Pitch is sometimes defined as the fundamental frequency of a sound wave (i.e. generally, the lowest frequency in a given sound wave). For most practical purposes this is fine, and pitch and frequency can be thought of as equivalent. Likewise, for most practical purposes, amplitude can be thought of as volume.

Technically, however, pitch (and volume) are human perceptions. Our perception of pitch and volume is therefore not based solely on frequency and amplitude respectively, but on a combination of both (and other factors as well). Frequency overwhelmingly dictates perceived pitch, but amplitude also has a small effect on pitch perception, especially when it is very large. For example, a very loud sound can have a different perceived pitch than you would predict from its frequency alone.

That all being said, these effects are usually negligible, and pitch can be thought of as equivalent to fundamental frequency.
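
To make the distinction concrete, here is a minimal Python sketch (my own illustration, not part of the answer above) that writes three short sine tones to WAV files: two that differ only in frequency, so they differ in pitch, and two that differ only in amplitude, so they differ in loudness. The file names and the 440/880 Hz choices are arbitrary.

import wave
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def write_tone(path, frequency_hz, amplitude, duration_s=1.0):
    # Generate a sine tone and save it as a 16-bit mono WAV file
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    samples = amplitude * np.sin(2 * np.pi * frequency_hz * t)
    pcm = (np.clip(samples, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)            # mono
        f.setsampwidth(2)            # 16-bit samples
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())

# Same amplitude, different frequency -> different perceived pitch
write_tone("tone_440hz.wav", 440.0, 0.5)
write_tone("tone_880hz.wav", 880.0, 0.5)
# Same frequency, different amplitude -> different perceived loudness
write_tone("tone_440hz_quiet.wav", 440.0, 0.1)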

What devices use sound waves?

Besides audio equipment:
sonar: echolocation under water and in air,
ultrasound: medical imaging,
kidney stone treatment: shock waves crush kidney stones,
seismology: earthquakes and volcanic eruptions, detecting large explosions from far away,
dental equipment: ultrasonic scalers, etc.,
ultrasonic cleaners: used mostly to clean jewelry,
ultrasonic humidifiers: create a water mist with ultrasound.

And of course any mechanical design has to meet vibration-control criteria, including the sway of tall buildings and bridges. Any periodic mechanical movement is, in essence, a sound wave, just at a very low frequency.

Will playing high frequency sounds damage my phone speaker?

Probably not. That higher frequency could have the same energy as the ones your phone produces regularly. The only risk in playing a frequency that high is that, because you can't hear it, you have no idea how "loud" it is or how hard your phone is working to produce it. But I doubt our phones are capable of producing signals that would destroy their own speakers, especially because high frequencies are the only ones they produce well. I also think that a 22 kHz sound loud enough to damage any speaker would be painful to dogs; in testing the app, whoever made it would have found a good volume for dogs, which would be perfectly safe for the phone.
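
As a rough illustration of the "you can't hear how loud it is" point, the Python sketch below (my own; the frequency, sample rate and amplitude are arbitrary choices) generates a 22 kHz test tone at a deliberately low digital level and reports its peak in dBFS, which is one way to sanity-check a tone's level before sending it to a speaker.

import numpy as np

SAMPLE_RATE = 48000   # high enough that 22 kHz sits below the Nyquist limit (24 kHz)
FREQ_HZ = 22000.0
AMPLITUDE = 0.1       # fraction of digital full scale; keep this conservative

t = np.arange(SAMPLE_RATE) / SAMPLE_RATE          # one second of samples
tone = AMPLITUDE * np.sin(2 * np.pi * FREQ_HZ * t)

peak_dbfs = 20 * np.log10(np.max(np.abs(tone)))   # 0 dBFS = digital full scale
print(f"22 kHz test tone peak level: {peak_dbfs:.1f} dBFS")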

Which high frequency sounds can dogs hear? Does that frequency hurt dogs' ears? How?

Dogs are renowned for their superior sense of smell, but they've got pretty sharp ears too. Compared to a human, a dog's hearing range is approximately twice as wide. Dogs typically can detect sounds between 67-45,000 Hz, while humans can detect sounds between 64-23,000 Hz. In the upper frequencies of a dog's hearing range, sounds can cause a dog irritation and discomfort.

Canine Hearing Range
While dogs are capable of hearing higher frequencies than humans, they by no means have the widest hearing range. Bats and whales can hear sounds up to 110,000 Hz, but are less capable of detecting lower-range sounds.

Volume Plus Frequency Equals Discomfort
It is not merely frequency that makes a sound uncomfortable for a dog; the sound must reach a certain volume too. At sufficient volumes, frequencies above 25,000 Hz become irritating for dogs. The louder and higher those sounds are, the more uncomfortable they become for the dog. Dogs may whimper, whine and run away if confronted with a sufficiently loud, high-frequency sound.

Practical Applications
Humans use high-frequency sounds to deter dogs from approaching, to distract dogs from misbehaving and to call them. Personal dog deterrents rely on blasting a loud, high-frequency sound to confuse, startle and irritate a dog. These sounds cause no permanent hearing damage, and once the dog is out of range, he will settle down. In some scenarios, it's important for an owner to signal to a dog using a sound that is distinctive. Dog whistles are extremely high frequency; in some cases they are higher than 23,000 Hz and are inaudible to human ears. These sounds cut through ambient noise and are more easily discernible to dogs.

It's Not All Bad
Sounds with a frequency between 23,000-25,000 Hz are inaudible to humans but tolerable for dogs. In some cases, the sounds are appealing to dogs because they are distinct from the familiar range of sounds present in the human environment. Pet food manufacturers have experimented with including sounds in this frequency range in their adverts to attract dogs to the television set.

What are EQ, reverb, virtualizer, and bass boost?

EQ, or equalizer, lets you adjust the maximum level that a given frequency band can reach. Most EQs have high, mid, and low bands. The high end covers the lighter, drier frequencies, usually found in the top end of vocals, synths, cymbal crashes and hi-hats. The mids are the middle, clearer frequencies, where most of the output sits. The low end is the wetter, bassier, foundation-type sound, like a kick drum or a dubstep bass. Adjust those bands to suit your style of music: in other words, pop would get more mid, a touch of high and regular low, while electro/techno would get mid and a touch of bass. Just play around with them until you hit the sound you're looking for in those three areas.
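
As a rough sketch of what those band adjustments amount to (my own illustration; the 250 Hz and 4,000 Hz crossover points and the FFT approach are arbitrary choices, not how any particular EQ is implemented), this Python snippet scales the low, mid and high bands of a signal independently:

import numpy as np

def three_band_eq(samples, sample_rate, low_gain=1.0, mid_gain=1.0, high_gain=1.0,
                  low_cut=250.0, high_cut=4000.0):
    # Apply simple per-band gains: low < 250 Hz, mid 250-4000 Hz, high > 4000 Hz
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    gains = np.where(freqs < low_cut, low_gain,
                     np.where(freqs <= high_cut, mid_gain, high_gain))
    return np.fft.irfft(spectrum * gains, n=len(samples))

# Example: boost the bass a little and tame the highs
# eq_samples = three_band_eq(samples, 44100, low_gain=1.5, mid_gain=1.0, high_gain=0.7)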

Reverb is like an echo, I suppose. Imagine taking the sound wave and letting it keep ringing and vibrating, making it a wetter, more flowing sound.
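
A crude way to picture that is a feedback-style echo: mixing delayed, progressively quieter copies of the signal back in. The Python sketch below is my own toy illustration (the delay time, decay and repeat count are arbitrary), not how real reverbs are built:

import numpy as np

def simple_echo(samples, sample_rate, delay_s=0.08, decay=0.5, repeats=4):
    # Add `repeats` decaying echoes, each `delay_s` seconds apart
    delay_n = int(delay_s * sample_rate)
    out = np.concatenate([samples, np.zeros(delay_n * repeats)])
    for i in range(1, repeats + 1):
        start = i * delay_n
        out[start:start + len(samples)] += samples * (decay ** i)
    return out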

Bass boost is pretty self-explanatory: it boosts the low end.

And I'm not entirely sure what a virtualizer is, aside from the fact that it's used to create a sense of spaciousness in audio. If you meant to type visualizer, then that is just some imagery that moves with the sound to make music more fun to watch and hear.

How is sound encoded digitally and vice-versa?

A microphone consists of a membrane, a magnet, and a wire. When sound waves hit the membrane, it vibrates, moving the magnet. The moving magnetic field induces a voltage in the wire, proportional to the force moving the magnet (electrical generators work by spinning wires through a magnetic field to create electricity; this is the same principle on a tiny scale).

The electrical signal travels through the wire to a microchip called an analog-to-digital converter (ADC). The ADC has a circuit inside that converts the input voltage into a number (represented by a certain number of bits), which is read by the computer a certain number of times per second (the sample rate). The computer stores this digital representation of the voltage changes caused by the vibrating membrane.

When the computer wants to play the sound back, it simply reverses the process, sending the numbers to a digital-to-analog converter (DAC) that puts out a voltage proportional to each number sent to it, at a certain frequency, causing a membrane to vibrate and produce sound (in this case a speaker, which also consists of a wire, magnet and membrane).

NOTE: This is a somewhat simplified description of the basic process. There are, in fact, a number of ways of representing sound waves. For example, just as there is AM and FM radio (amplitude modulation and frequency modulation), there are different ways of representing the sound coming in from the microphone (and going out). Also, there are speakers and microphones that, rather than using a membrane and magnet, use the piezoelectric effect, where mechanical stress on a material causes electrical changes in it (and vice versa).
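
As a simplified illustration of the sampling and quantization steps described above (my own sketch: a generated sine wave stands in for the microphone voltage, and the file name, frequency and sample rate are arbitrary), this Python snippet "samples" a waveform, quantizes it to 16-bit integers, stores it in a WAV file, and reads the numbers back out the way a playback path would:

import wave
import numpy as np

SAMPLE_RATE = 44100    # how many times per second the "voltage" is measured
FULL_SCALE = 32767     # largest value a signed 16-bit sample can hold

# "Analog" input: a 440 Hz sine wave standing in for the microphone signal
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
voltage = 0.5 * np.sin(2 * np.pi * 440.0 * t)

# "ADC": quantize each sampled value to a 16-bit integer
pcm = (voltage * FULL_SCALE).astype(np.int16)

with wave.open("tone.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)            # 2 bytes = 16 bits per sample
    f.setframerate(SAMPLE_RATE)
    f.writeframes(pcm.tobytes())

# "DAC" side: read the stored numbers back; the sound card would convert these
# integers into a voltage that drives the speaker membrane
with wave.open("tone.wav", "rb") as f:
    rate = f.getframerate()
    frames = f.readframes(f.getnframes())
recovered = np.frombuffer(frames, dtype=np.int16)
print(recovered[:5], rate, "samples per second")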
