For the most part, romance novels are stories about women finding and taking up space for themselves. And not just taking up space, but daring to find happiness. And yes, romance novels are about the fantasy, the heterosexual fantasy, of having the perfect relationship with a man, but they're also about women taking power over their sexuality, women taking control over their lives, women making themselves vulnerable to all the intimacies of love...