Given the strong feelings many people have toward the media today, as well as the recent events with the hurricane, I feel I should bring up one of the major debates about the journalistic trade and see how members feel about it.
The question is this: should the media make the news and take part in it, or should they simply stand back and report it?
The public seems extremely fickle on the subject, as the media has received flak for acting both ways. For example, in my state a politician well known for his anti-gay-rights stance was suspected of propositioning boys for sex over the internet. One of the local media outlets hired an investigator, who posted false information in his profile and entered a gay chatroom the politician supposedly frequented. The investigator made no attempt to contact the politician; instead, the politician contacted him and propositioned him.
The media reported this, and the ensuing storm essentially destroyed the politician's reputation. Many people are saying, however, that the media had no right to go out and "make" the news like that; their job is to report, not to take any kind of action beyond that.
On the other hand, in times of great need, many journalists on location have shot some of the most horrific scenes, and there the public outrage runs the opposite way: that they are taking pictures and writing stories instead of helping the people in need (which would be taking part in the news, making the news, so to speak).
So what do you think? Is it the duty of the media simply to report and not take part in the world around them, or should they take an active part in the news they are reporting?