I don't think animals are more important than humans, but I do feel that we as a society have a greater responsibility to the animals we've domesticated than we currently show. Not only did we create their current place in our lives (whether as a pet or as an animal raised for consumption/production), but they, like children (for whom we also have a greater responsibility), can't do much to alter their own circumstances. So it's up to us to make sure they're treated well and not abused (even with animals raised for food, there's a humane way and a wrong way to slaughter them). Yet a lot of that responsibility gets tossed away due to greed, meanness, or just plain stupidity.
*insert appropriate Spider-man quote here*
Pan Female, Hinge in a V between my mono (straight) husband, Monochrome and my poly (pan) partner, ThatGuyInBlack