Lemma 13.35.7. Let \mathcal{A} be an abelian category. Let \mathcal{D} = D(\mathcal{A}). Let \mathcal{E} \subset \mathop{\mathrm{Ob}}\nolimits (\mathcal{A}) be a subset which we view as a subset of \mathop{\mathrm{Ob}}\nolimits (\mathcal{D}) also. Let K be an object of \mathcal{D}.
(1) Let b \geq a and assume H^i(K) is zero for i \not\in [a, b] and H^i(K) \in \mathcal{E} if i \in [a, b]. Then K is in smd(add(\mathcal{E}[a, b])^{\star (b - a + 1)}).
(2) Let b \geq a and assume H^i(K) is zero for i \not\in [a, b] and H^i(K) \in smd(add(\mathcal{E})) if i \in [a, b]. Then K is in smd(add(\mathcal{E}[a, b])^{\star (b - a + 1)}).
(3) Let b \geq a and assume K can be represented by a complex K^\bullet with K^i = 0 for i \not\in [a, b] and K^i \in \mathcal{E} for i \in [a, b]. Then K is in smd(add(\mathcal{E}[a, b])^{\star (b - a + 1)}).
(4) Let b \geq a and assume K can be represented by a complex K^\bullet with K^i = 0 for i \not\in [a, b] and K^i \in smd(add(\mathcal{E})) for i \in [a, b]. Then K is in smd(add(\mathcal{E}[a, b])^{\star (b - a + 1)}).
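For orientation, here is a hedged sketch (not the proof recorded for this lemma) of why a statement like (1) is plausible. If b = a, the assumptions force K \cong H^a(K)[-a] in D(\mathcal{A}), so K is a single shift of an object of \mathcal{E}; with the shift convention for \mathcal{E}[a, b] fixed earlier in this section, this is the content of (1) in the degenerate case, since add(\mathcal{E}[a, a])^{\star 1} = add(\mathcal{E}[a, a]). For b > a one expects to induct on b - a using the canonical truncation triangle

\tau_{\leq b - 1}K \to K \to H^b(K)[-b] \to (\tau_{\leq b - 1}K)[1]

whose outer terms have cohomology concentrated in [a, b - 1] and in degree b respectively, together with the compatibilities of smd and add with the operation \star recorded earlier in the section.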